Firefighters often start helping from the floor where the fire began, but this incident proved it may not be the floor with the most casualties. That got me thinking: what if firefighters could go first to the floor that needs the most help and, as a result, save more people? I was determined to build something that empowers firefighters to do their job optimally, i.e. save more people.
So with that, let's get started!
A drone carrying the Walabot Pro device would be useful for commercial inspection purposes. I had wanted to use an ultrasonic system but immediately discarded the idea because I knew propeller noise would prevent useful measurements from being made.
This drone can show where people/things are located within a building (or anywhere) by displaying an image with different colors based on density.
It uses a Walabot and a Raspberry Pi that send information to the user.
Questions & Answers:
How does it work? It works by using a Walabot, a 16-segment radar array that can "see" through solid objects. The code I'm using displays objects by their density. An animal shows up redder on the screen because the higher water content in its body makes it denser.
Why is it required? My use case is firefighting and/or search and rescue (SAR): the drone (with legal permission) could fly over a burning building and show where people or pets are trapped.
Are you intruding on my privacy? I know that, given the way people look at drones nowadays, many believe they are just for spying on people. There are people who misuse drones and should be punished, but most people flying drones are doing everything legally and appropriately. This drone will only be used for good purposes, so you don't have to send me messages about how you think I'm breaking the law. (Check your local regulations.)
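The density-to-color idea described above can be sketched in Python. This is a minimal illustration of a heat-map style mapping; the real rendering is done by the Walabot SDK example apps, and the value range here is my own assumption:

```python
# Sketch: map normalized Walabot image values (0.0-1.0) to heat-map colors.
# Higher values (denser, water-rich targets such as people) render redder.
# Illustrative only; not taken from the Walabot SDK.

def density_to_rgb(value):
    """Map a normalized density value to an (R, G, B) tuple.

    0.0 -> blue (low density), 0.5 -> green, 1.0 -> red (high density).
    """
    v = max(0.0, min(1.0, value))  # clamp to [0, 1]
    if v < 0.5:
        # blend blue -> green over the lower half of the range
        t = v / 0.5
        return (0, int(255 * t), int(255 * (1 - t)))
    # blend green -> red over the upper half
    t = (v - 0.5) / 0.5
    return (int(255 * t), int(255 * (1 - t)), 0)

if __name__ == "__main__":
    for v in (0.0, 0.5, 1.0):
        print(v, density_to_rgb(v))  # blue, green, red
```

A real app would apply this per pixel to the image slice returned by the Walabot API.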
Problems:
Interference: The Walabot sometimes picks up interference from the motors; luckily, it's very rare.
Battery Life: The drone is heavy, so battery life is poor. Unfortunately, this cannot be changed.
I will use a laptop to explore the Visual Studio 2015 example projects and develop my own project geared towards solving inspection drone problems. What I learn doing this will tell me how practical the system is and how much effort is worthwhile in the next phase of the project below.
What Walabot can see
- Metal Objects
- Water filled objects - including people
- PEX pipe material, even when empty of water. I was impressed that empty plastic tubing is detected.
What Walabot cannot see
- Wood sticks: these are oven-dried in production, so there is very little moisture and little material for Walabot to detect. Walabot could, however, see a hardwood walking stick with a finish on it.
With Walabot Pro, since the sensing is done with radio-frequency energy, none of these problems exist.
There are two problems that I hope to solve:
1) Collision Prevention
Avoiding crashes is the first priority, since the service being paid for is inspecting, not crashing.
2) Holding a fixed position, relative to the thing being inspected, while hovering.
If the drone can be made aware of its environment, it can hover precisely near the object being inspected, giving the pilot a great deal of help.
Preliminary Conclusions
I have additional work to do: proving the merit of changing the Walabot's viewing angle slightly and collecting more data to produce a composite picture of a scene, with higher confidence in the placement of objects in the scene.
Some significant effort must be made to separate detected objects from noise when the detecting range extends to 2 to 3 meters. Walabot will be useful as a ranging and detection method used in conjunction with optical methods. The resolution does not seem high enough to provide a pilot assistance feature without also using optical methods. This should not be taken as a criticism of Walabot! I had no expectation that Walabot would replace optical sensors on inspection drones.
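The composite-picture idea above can be sketched with a simple per-cell average across frames taken at slightly different aims, suppressing cells that are not consistently energetic. The frame values and the 0.6 threshold are illustrative assumptions, not numbers from the Walabot SDK:

```python
# Sketch: combine several Walabot frames into one composite, keeping only
# cells that are consistently energetic across frames (noise suppression).
# Values and the threshold are illustrative, not from the SDK.

def composite(frames, threshold=0.6):
    """Average per-cell values across frames; zero out cells whose
    average falls below the threshold (treated as noise)."""
    n = len(frames)
    cells = len(frames[0])
    out = []
    for i in range(cells):
        avg = sum(f[i] for f in frames) / n
        out.append(avg if avg >= threshold else 0.0)
    return out

if __name__ == "__main__":
    # Three simulated frames: cell 1 is a stable target; cells 0 and 2
    # only flicker, so they are averaged away as noise.
    frames = [[0.9, 0.8, 0.1], [0.0, 0.9, 0.2], [0.1, 0.7, 0.0]]
    print(composite(frames))
```

A real version would work on the 2-D image slice the API returns, but the averaging logic is the same.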
Visual Studio 2015 example projects
The three C++ example Microsoft Visual Studio projects, InWallApp, BreathingApp and Sensor_SampleCode from Walabot were used to get familiar with the Walabot API. They make a nice starting point for development of an App for commercial inspection drones.
Practical work towards a flight platform for Walabot Pro.
Setup the Walabot and Raspberry Pi
You will need at least an 8 GB SD card with Raspbian installed on it. Plug the Pi into HDMI (if you have another Raspberry Pi, you can configure everything on it first and then move the SD card to the Pi Zero) and wait for the GUI to start. Open the web browser and download the Walabot SDK for Raspberry Pi. Once it is installed, you will need to add the following lines to the /boot/config.txt file to support the Walabot:
safe_mode_gpio=4
max_usb_current=1
Reboot the device now
Raspberry Pi
I am developing a drone Walabot system using a Raspberry Pi. Python seems to be a good way to develop an app that can easily run on the Raspberry Pi and be rigorously tested on Windows or other platforms.
1) Get the SDK for Raspberry Pi
2) Download the Raspberry Pi installer, a file named walabotSDK_RasbPi.deb
3) Install the SDK
sudo dpkg -i walabotSDK_RasbPi.deb
4) Build the example programs
sudo ./buildAll.sh
5) Run the example programs
./inWall.out
./sensorTarget.out
A fairly complicated task remains: making the Walabot API remotely controllable through a communications link. Under consideration are the WiFi and Bluetooth wireless links available on the Raspberry Pi 3. An LTE modem is also under consideration. This is a fairly large task that I do not expect to finish quickly.
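As a starting sketch for such a link, the Pi could push readings to a ground station as JSON lines over TCP. The message shape (`"targets"` with per-target dicts) and the function names are my own assumptions for illustration; the Walabot API does not define a wire format:

```python
# Sketch: send Walabot readings to a ground station as JSON lines over TCP
# (e.g. over the Pi's WiFi). The message shape is an assumption for
# illustration, not part of the Walabot API.
import json
import socket


def encode_targets(targets):
    """Serialize a list of target dicts as one newline-terminated JSON line."""
    return json.dumps({"targets": targets}) + "\n"


def send_targets(host, port, targets):
    """Open a TCP connection to the ground station and send one reading."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(encode_targets(targets).encode("utf-8"))
```

The Pi would call `send_targets` after each measurement cycle; the ground station just reads lines and parses each one with `json.loads`.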
Getting Started on using the Walabot API through a wireless link
I am guessing that there are more code examples for Raspberry Pi in Python than in C/C++, so my efforts will start with Python. I have already used Python to access GPIO on the Raspberry Pi, so I am confident I can build a larger system in which servos point the Walabot, measurements are taken, the aim is changed slightly, and more measurements are taken. This mimics the way animals use vision and hearing while hunting prey. I am also confident I can use Python to make this system an IoT device. Here is a link for using the Walabot API from Python.
The Python example code was installed at the same time as the C/C++ examples, so there are no extra steps to get started with the Python examples. Start the Python IDE on the Raspberry Pi, load the example of your choice from the directory /usr/share/doc/walabot/examples/python, and start having fun.
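The measure-aim-measure loop described above needs a sweep schedule for the aiming servo. Here is a minimal sketch; the angle units, step size, and sweep width are illustrative, and driving the actual servo over GPIO is left out:

```python
# Sketch: generate a small back-and-forth servo sweep around a center aim,
# so the Walabot can take measurements at slightly different angles.
# Angles are in degrees; the limits and step are illustrative.

def sweep_angles(center, step, width, cycles=1):
    """Return aim angles oscillating around `center`, +/- `width` degrees."""
    offsets = list(range(-width, width + 1, step))
    angles = []
    for _ in range(cycles):
        angles.extend(center + o for o in offsets)            # sweep up
        angles.extend(center + o for o in reversed(offsets))  # sweep back
    return angles

if __name__ == "__main__":
    # One cycle: 80 -> 100 in 5-degree steps, then back down.
    print(sweep_angles(90, 5, 10))
```

In the real system, each angle would be sent to the servo via GPIO, followed by a Walabot trigger-and-read before moving on.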
The Sensor App in Python is running!
I plan to use a Walabot Pro to replace the camera on a remotely controlled gimbal camera mount. The gimbal will likely be very useful for making slight changes in the orientation of the Walabot Pro to help improve the confidence of object detection. This would be useful for preventing collisions, but I have a second concept that I prefer for commercial inspection work. It is a lower-cost implementation that will satisfy a segment of the inspection drone market.
Attach the Walabot and Raspberry Pi to the Drone
I used a transformer case as the mount for the drone. It stuck well to the Walabot magnet, and I then used hot glue and super glue to ensure the mount won't come undone. Make sure you have the 5V, GND, and TV pins in place before you put everything together; it will make things easier for you.
WARNING: Be careful with the Walabot; you would probably be better off wearing gloves when handling it.
Setup the Amazon Alexa Skill and AWS Lambda Function
I am currently working on the Alexa part and hope to finish it soon. :)
Example Conversation:
Human: "Alexa, open Spy Walabot".
Alexa: "Okay"
Human: "Alexa, Ask Spy Walabot, How Many People are in There?".
Alexa: "Okay"
Alexa: "Floor 3 People 8, Floor 6 People 3, Floor 8 People 2, Floor 7 People 1 "
The Alexa Skill allows you to query the system through an Amazon Echo device. AWS Lambda hosts the Alexa Skill code.
- Step 1: Click here and sign into the Amazon developer console or create an account.
- Step 2: Click "Your Alexa Dashboard," then select "Get Started" under Alexa Skills Kit in the developer console.
- Step 3: Click "Add New Skill," and fill in the required skill information. Click "Save" then "Next."
- Step 4: Copy the contents of intentSchema.json from the downloaded GitHub repository into the Intent Schema field in the Amazon developer console.
- Step 5: Add each of the slot types in the picture below, fill in the mode and the values on consecutive lines.
- Step 6: Copy the contents of sampleUtterances.txt into the Sample Utterances field in the Amazon developer console. Click "Save" then "Next."
- Step 7: Go back to the AWS console and Click "Lambda."
- Step 8: Click "Create Function," and fill in the name. Make sure the runtime is "Node.js 6.10." For "Role," select "Create a custom role," and fill in the information to match the picture below. Press "Allow."
- Step 9: Now change "Role" to "Choose an existing role." Select the role that you just created.
- Step 10: Click "Create Function."
- Step 11: Select "Alexa Skills Kit" from the list of triggers. Scroll down to select "Disable" Skill ID verification, and then click "Add" and "Save."
- Step 12: Click the Lambda function at the top of the tree in the Designer. Scroll down to "Function code."
- Step 13: Change the code entry type to "Upload a .ZIP file." Then upload the generated lambda.zip file and press "Save."
How the Architecture Works
1) The Walabot, connected to the Raspberry Pi, posts information to the server
2) The server stores the information in a MySQL database
3) The Alexa intent is placed through the Dash Wand
4) The request goes through the Alexa Skills Kit, and the intent is passed to the server
5) The server calculates the number of people on each floor and returns the floors ordered from most people to least
6) Emergency services can save the most lives
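Step 5 can be sketched in Python. The `{floor: count}` input shape is my assumption about what the MySQL query returns, and the function names are illustrative; the speech format follows the example conversation above:

```python
# Sketch of step 5: order floors by head count and build the speech string
# Alexa reads back. The {floor: count} input shape is an assumption about
# the database query result, not taken from the project code.

def floors_by_need(counts):
    """Return (floor, people) pairs, most people first."""
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

def speech(counts):
    """Format the ordered floors as the sentence Alexa speaks."""
    return ", ".join(
        "Floor {} People {}".format(floor, people)
        for floor, people in floors_by_need(counts)
    )

if __name__ == "__main__":
    # Matches the example conversation: floor 3 has the most people.
    print(speech({3: 8, 6: 3, 8: 2, 7: 1}))
```

The server would run this on the latest per-floor counts and hand the resulting string back through the Alexa Skills Kit as the response text.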