Good progress on learning Node.js. Keep up the good work, and don't forget to code along while watching the tutorials. When I watch video tutorials, I always code at the same time, and I vary a little from the tutorial to see whether I really understand it.
You should go back to "the code" all the time to see whether you understand it better from a coding point of view. At the same time, identify which parts of the code you don't understand, and look for tutorials to help you. You can learn a programming language systematically (linearly following tutorials lesson by lesson), and at the same time you can learn it centered on an existing codebase, looking up tutorials non-linearly as needed. I suggest that this week you make more progress on the tutorials (how about 10 tutorials?) while pushing your understanding of the code much further.
Sunday, July 31, 2016
Summer Research Week 4 Homework
Notes from Videos
Understanding References to Objects, thenewboston 5
We found that this video was very helpful because it helped us understand the difference between three equal signs and two equal signs. Everything was well explained in the video and we did not have any questions.
Understanding References to Objects, thenewboston 5
- in node, everything is a reference
- getting a reference when you adjust variable
- equal signs
- two
- number can equal a string
- compare value
- three
- number is not equal to string
- compare values & types
- string =/= integer
- this
- "the thing that called it"
- global
- the whole program will call it
We found this video somewhat confusing because the explanation was repetitive. He went through the topic quickly, and we think that we will need to dig deeper into this idea. We have a sense of what it is, but we are not that confident about it.
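To make the equal-signs and "this" ideas concrete, here is a small sketch that runs in Node (the drone object is our own invented example, not from the video):

```javascript
// Two equal signs compare values after type coercion, so a number can equal a string.
console.log(5 == '5')  // true: values match once '5' is coerced to a number
// Three equal signs compare value AND type, so a number is never equal to a string.
console.log(5 === '5') // false: string is not the same type as number

// "this" is "the thing that called it".
var drone = {
  name: 'AR Drone',
  whoAmI: function() { return this.name }
}
console.log(drone.whoAmI()) // 'AR Drone': drone called the function, so this is drone
```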
Prototype, thenewboston 7
- allows you to give new method and properties to things
- can set equal to a function
- += means add
- -= means subtract
- everything has access to the prototype
- can set amount to properties
So far, this was our favorite video because it was entertaining and educational. We got a lot of information out of this; it was explained in a way we could both understand. The process to do this seems long, but it is just a lot of repetition.
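A short sketch of the prototype idea from the video (the Drone constructor and its method names are our own illustration):

```javascript
// Constructor: every drone starts at some altitude.
function Drone(altitude) {
  this.altitude = altitude
}
// Methods placed on the prototype are shared: every Drone instance has access to them.
Drone.prototype.climb = function(amount) {
  this.altitude += amount // += means add
}
Drone.prototype.descend = function(amount) {
  this.altitude -= amount // -= means subtract
}

var d = new Drone(100)
d.climb(50)
d.descend(20)
console.log(d.altitude) // 130
```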
Modules, thenewboston 8
- group similar codes together
- organizes
- to include modules in the main file
- export and import to original
- module.export
- tag on any variable
- only this variable's content will be exported
- other variables stay
- to import module
- var name = require ('./movie')
- requiring a file to be placed into the main
- look for movie in the directory
- never include .js when doing this
I think that this might be the most relevant to us right now and it seems very useful. We thought that maybe we could use modules to separate certain characteristics in a drone such as the pitch, yaw, and roll movements. These would all be characterized in movements which is one small part of the actual drone.
More on Modules, thenewboston 9
- module.exports = {insert the code}
- quicker and more efficient way to export modules
Quick video that taught us a more efficient way to export modules than the method in Modules, thenewboston 8. We could import multiple items at once with this method, without having to "tag on" variables one by one. In this method, everything inside the brackets is an object that will be imported into the main file.
Shared State of Module, thenewboston 10
- default behavior is to share with other files that are imported
- makes it so one variable will be referenced
- can be useful for performance and memory
- it can be inconvenient for specific person's preferences
We understood why the video said this could be either convenient or inconvenient. Since one variable is referenced and shared when a module is imported this way, every importing file essentially sees the same state. This is easier in scenarios where repetition is necessary; everything is done for you. However, it can be inconvenient if every importer needs its own unique copy rather than this shared state.
This week, we wanted to familiarize ourselves with coding. We watched tutorials from thenewboston to enhance our understanding of Node.js because we will be using it. This YouTube channel made Node.js easier to comprehend, and it was entertaining. We learned about referencing, modules, prototypes, and the meaning of "this." There was a lot of information to soak in, so we took our time and re-watched the videos to make sure we absorbed most of the content. We spent most of the time this week looking at our previous assignments to recall any information that would be necessary to understand. After this assignment, we came up with ways we could apply this knowledge to our project.
Due to vacation and work, we do not think that we can test out the drone at the moment.
Monday, July 25, 2016
Summer Research Week 4 (07/25 - 07/29) and Beyond
Your group has started entering the core of your project - "the code" that autonomously navigates the drone based on the camera input. A few different technologies need to be put together in order to solve the puzzle.
- AR Drone data and control communication protocol: You can find some info in the latter chapters of the AR Drone Developer Guide. You will receive navigation data and a video stream from the drone and, after processing, send flight control commands back to the drone. This process needs to happen in real time, i.e., within a fraction of a second. The main issue the former team had was that the processing took too long (or the drone responded too slowly), and thus the drone drifted away too much.
- Node.js: It's the language of "the code". You can continue watching the video tutorials to build up your programming skills. It will help you read/write the code more efficiently.
- OpenCV: The most important part of the project is to make the drone "intelligent" enough to navigate by itself. This can be achieved by a technology called "computer vision" (CV). Computer vision is a branch of artificial intelligence that aims to make the computer understand what it "sees". There are tons of CV techniques developed over the decades, and you don't need to re-invent the wheel. Instead, you can download and import a library called OpenCV. There are many existing functions you can call to fulfill your needs, ranging from basic image processing to facial recognition. You can search the web to find many general OpenCV tutorials, and you may find some specific ones for OpenCV and Node.js. (Don't forget to add the new resources to the "Project Resource" page.)
- Navigation schemes: How can we make the drone accomplish specific tasks, such as delivering a payload from one room to another? That involves navigation schemes. The former team tried circular markers in the air (via the front camera) and circles-and-lines markers on the ground (via the bottom camera). You might test theirs, modify them, and develop your own.
- Control theory: It aims to make the flight smooth from point A to point B. You might remember the line-following robot: when its control is not very good, the robot turns left and right (overshoots) a lot and travels roughly. However, we can deal with this factor later, after we make the drone work.
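The overshoot idea can be sketched with a simple proportional controller (this is an illustration of the general concept, not the drone code): the correction shrinks as the error shrinks, so the robot slows down near the line instead of charging across it at full speed.

```javascript
// Proportional control: the correction is proportional to the remaining error.
function pControl(target, current, gain) {
  return gain * (target - current)
}

// Simulate a robot approaching a line at position 0 from position 10.
var position = 10
for (var step = 0; step < 20; step++) {
  position += pControl(0, position, 0.5) // each step removes half the error
}
console.log(position) // very close to 0, with no overshoot when gain < 1
```

With a gain above 1 the correction overshoots the target on every step, which is exactly the left-right wobble described above.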
Your task for the rest of the summer is to go through many iterations among these technologies and gradually, fully understand and implement the code. One efficient way to make tangible progress is to implement and test the code piece by piece on the drone. Once you successfully understand and implement one piece of the code, you then move to the next one. It will be too complicated and hard to debug if you try to do it all at once. Could you continue posting your weekly progress according to this plan? And I will give you comments accordingly. Good luck!
Wednesday, July 20, 2016
Summer Research Week 3 Homework: Code Notes
Began annotating the code from last year. Progress so far is attached.
//ardroneAutonomousControl.js
//image = 640x360
//Blob detection
//Horizontal tracking
//Vertical tracking
//Marks Radius
/* AGENDA
√ Find blobs
√ Record edge links
√ Test bottom camera
√ Test if edge link detection is done accurately by marking them NOTE: I'm wondering if links should store an edge? var, if edge finding is asynchronous at all.
√ Fix glitches with blob detecting
√ (skipping blobs)
√ (green on bottom and right borders of the image)
√ Record radii from center to edge links
√ Record max-min radial difference
√ Find blob with largest difference (not average difference)
√ Get blob line
√ Find blob line direction
√ Mark path
√ Test + fix path marking
√ Use Ø(droneDirection-blobDirection) to control Yaw angle
√ Use bottom camera
• Incorporate second color for junctions, with original functions
√ Try getting navdata for its orientation to control its drift more accurately
√ Figure out how to read navdata (it's not a straight string...)
√ Use edge pixels when finding junctions and clean up analyzeBlobs()
√ Incorporate navdata to help hovering
√ Fix the "max call stack size exceeded" error: don't use recursion for finding blobs anymore.
√ Fix new errors with findBlobsNoRecursion(): out-of-bounds[√], infinitely-large-blob[√] = problem: pixels that are already links are added as news.
√ Look up Hough functions that could possibly find lines and replace findBlobsNoRecursion()
• Fix drone movement:
try not updating line data if no new line is found [x],
don't do any command other than HOVER 2x in a row [x],
allow drone to do same command twice w/ timer [?],
have path shoulders which help if the drone is lost [ ]
try sending initial navdata-enabling command to see if altitude and velocity data becomes available [ ]
> the command is:
> client.config('general:navdata_demo', 'FALSE');
*/
/* COLOR KEY:
WHITE: line marker
GRAY: junction marker
RED: radius
BLUE: center, best path estimation
YELLOW: path direction head
GREEN: edge
*/
var ardrone = require('ar-drone')
var jimp = require('./jimp-master/index.js')
var math = require('./mathjs-master/index.js') //Comprehensive math library (used for square root, exponents, absolute value, vector math, etc.)
//Navdata
var orientation = [0.000,0.000,0.000] //direction facing
var origin = [0,0,0] //start location
var client = ardrone.createClient()
var pngImage //640*360
var markerX = -1
var markerY = -1
var markerR = -1
var pathX = -1
var pathY = -1
var pathA = -1
var erosionFactor = 2
var count = 0
var skipSize = 10
var command = [0,0] //0,1,2,3,4
var pCommand = [0,0] //0,1,2,3,4 = HOVER,UP,RIGHT,DOWN,LEFT
var pCommandTimer = [0,0]; //counts how long the drone has been trying the same command
var timeOffCourse = 0;
var color1 = [240,100,100]
var color2 = [240,172,110]
var blobsFound = new BlobLibrary()
client.config("video:video_channel", 1)
var pngStream = client.getPngStream()
pngStream
.on("error", console.log)
.on("data", function(incoming) {
processImage(incoming)
})
client.on("navdata", function(navdata) {
getMotionData(navdata)
if (pCommand[0] == command[0]) {
pCommandTimer[0]++
}
else {
pCommandTimer[0] = 0
}
if (pCommandTimer[0] > 50) {
pCommand[0] = 0 //same command held too long: clear it so controlFlight() may send it again
}
else {
pCommand[0] = command[0] //remember the command just sent so controlFlight() won't repeat it immediately
}
if (pCommand[1] == command[1]) {
pCommandTimer[1]++
}
else {
pCommandTimer[1] = 0
}
if (pCommandTimer[1] > 45) {
pCommand[1] = 0 //same command held too long: clear it so controlFlight() may send it again
}
else {
pCommand[1] = command[1] //remember the command just sent so controlFlight() won't repeat it immediately
}
controlFlight()
count++
})
if (count < 30) {
client.takeoff()
}
//.................................................................... DECLARATION
function getMotionData(navdata) { //I wanted to stabilize the drone by countering its lean
if (count > 10) {
if (count < 30) { //origin = beginning
origin[0] = navdata.demo.rotation.roll
origin[1] = navdata.demo.rotation.pitch
origin[2] = navdata.demo.rotation.yaw
}
else { //orientation = facing
orientation[0] = navdata.demo.rotation.roll
orientation[1] = navdata.demo.rotation.pitch
orientation[2] = navdata.demo.rotation.yaw
}
}
}
function controlFlight() { //Control drone based on given path (X,Y,A)
if (count < 500 && count > 50) {
if (pathA > -1 && pathX > -1 && pathY > -1) {
var distance = math.sqrt(math.pow(pathX-(640*0.5),2) + math.pow(pathY-(320*0.5),2))
var angleV = math.pi * 1.5
angleV = pathA - angleV
if (distance > 320/3) { //CENTER OVER THE PATH OR MOVE FORWARD
timeOffCourse++;
var xMore = false;
var xV = pathX - (640*0.5)
var yV = pathY - (320*0.5)
if (math.abs(xV) > math.abs(yV)) {
xMore = true;
}
xV /= math.abs(xV)
yV /= math.abs(yV)
if ((timeOffCourse*0.001) < 0.04) {
xV *= 0.05 - (timeOffCourse*0.0005)
yV *= 0.05 - (timeOffCourse*0.0005)
}
else {
xV *= 0.005; //0.01
yV *= 0.005;
}
if (xV > 0.0) {
command[0] = 2
}
else if (xV < 0.0) {
command[0] = 4
}
if (yV > 0.0) {
command[1] = 3
}
else if (yV < 0.0) {
command[1] = 1
}
client.stop()
if ((pCommand[1] == 0 || pCommand[1] != command[1]) && !xMore) {
if (command[1] == 1) {
client.front(math.abs(yV))
console.log("FRONT")
}
else if (command[1] == 3) {
client.back(math.abs(yV))
console.log("BACK")
}
}
if ((pCommand[0] == 0 || pCommand[0] != command[0]) && xMore) {
if (command[0] == 2) {
client.right(math.abs(xV))
console.log("RIGHT")
}
else if (command[0] == 4) {
client.left(math.abs(xV*1.5))
console.log("LEFT")
}
}
}
else {
timeOffCourse = 0;
if (distance < 320/3 && math.abs(angleV) > 0/*(math.pi*0.1)*/) { //ROTATE
client.stop()
if (math.abs(angleV) < (math.pi*0.5)) {
if (angleV > 0) {
client.clockwise(0.1)
console.log("CLOCK")
}
else if (angleV < 0) {
client.counterClockwise(0.1)
console.log("COUNTER")
}
}
else {
console.log("PATH IS PERPENDICULAR")
}
}
if (distance < 320/3) { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] >origin[1]+4) {
// client.front(0.08)
// }
client.stop()
client.front(0.02);
command = [0,0]
console.log("PATH FOUND :)") //found path
}
}
}
else { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] > origin[1]+4) {
// client.front(0.08)
// }
command = [0,0]
console.log("LOST :(") //can't locate path
}
}
else {
if (count >= 500 && count < 510) { //if count meets criteria, drone lands
client.stop()
client.land()
}
}
}
function processImage(input) { //Find path and junction in image
pngImage = input
jimp.read(pngImage, function(err, image) {
if (err) throw err
image = thresholdImage(image)
findBlobsNoRecursion(image)
analyzeBlobs()
var line = findLines()
// var marker = findJunctions()
//
// if (marker[0] > -1 && marker[1] > -1) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0],marker[1])
// for (var i=0; i<marker[2]; i++) {
// if (marker[0] + i + 1 < image.bitmap.width) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0]+i+1,marker[1])
// }
// }
// }
// else {
// //console.log("NO JUNCTIONS")
// }
if (line[0] > -1 && line[1] > -1 && line[2] > -1) {
var vectorX = math.cos(line[2]) * 1
var vectorY = math.sin(line[2]) * 1
for (var i=1; i<20; i++) {
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),line[0] + math.round(vectorX*i),line[1] + math.round(vectorY*i))
}
image.setPixelColor(jimp.rgbaToInt(255,255,0,255),line[0] + math.round(vectorX*20),line[1] + math.round(vectorY*20))
pathX = line[0]
pathY = line[1]
pathA = line[2]
}
else {
//console.log("NO LINES")
}
markBlobs(image)
//image.write("./droneControlOutput/img_" + count + ".png")
// markerX = marker[0]
// markerY = marker[1]
// markerR = marker[2]
})
}
function thresholdImage(image) { //Color thresholding (looking at image to figure things out, color-wise)
for (var y = 0; y < image.bitmap.height - skipSize; y += skipSize) {
for (var x = 0; x < image.bitmap.width - skipSize; x += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(x,y))
if (color.r / color.b > (color1[0]/color1[2]) - 1.5 && color.r / color.b < (color1[0]/color1[2]) + 2.5 && color.r / color.g > (color1[0]/color1[1]) - 1 && color.r / color.g < (color1[0]/color1[1]) + 2.5) { //~ORANGE
image.setPixelColor(jimp.rgbaToInt(255,255,255,255),x,y)
}
/*else if (color.r / color.b > (color2[0]/color2[2]) - 0.5 && color.r / color.b < (color2[0]/color2[2]) + 0.5 && color.r / color.g > (color2[0]/color2[1]) - 0.5 && color.r / color.g < (color2[0]/color2[1]) + 0.5) { //GREEN
image.setPixelColor(jimp.rgbaToInt(100,100,100,255),x,y)
}*/
else {
image.setPixelColor(jimp.rgbaToInt(0,0,0,255),x,y)
}
}
}
return image
}
function findBlobsNoRecursion(image) { //Find groups of pixels of the same color (grouping colors)
blobsFound.blobs = [] //clear blobs from previous image
var pixNums = [0,0] //just to keep track of how many pixels were kept vs. how many were not after thresholding
for (var startY = 0; startY < image.bitmap.height - skipSize; startY += skipSize) { //Loop through all pixels (accounting for skipSize) in the image
for (var startX = 0; startX < image.bitmap.width - skipSize; startX += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(startX,startY)) //Get color of current pixel (startX,startY)
var inBlob = false
if (color.b > 0) { //**COMMENT NOT FOR MR LIN** type1 = 255, type2 = 100
pixNums[0]++
for (var i=0; i<blobsFound.blobs.length; i++) { //Loop through all blobs found so far to check if current pixel has already been used
for (var j=0; j<blobsFound.blobs[i].links.length && inBlob == false; j++) {
if (blobsFound.blobs[i].links[j].x == startX && blobsFound.blobs[i].links[j].y == startY) {
inBlob = true
}
}
}
}
else {
pixNums[1]++
}
if (inBlob == false && color.b > 0) { //If pixel is within threshold and not already used, then create a new blob
var edges = [] //A selection of links that will be used to find blob radii outside of findBlobsNoRecursion()
var links = [] //Points that will make up the new blob
var news = [] //Points that haven't been checked yet for new neighboring white pixels
news.push(new Link(startX,startY)) //Add first pixel to news
var iteration=0 //Just for me to see how long it takes for the program to finish the blob
while (news.length > 0) { //While there are still pixels whose neighbors are not checked...
var len = news.length //Number of pixels which, as of now, aren't checked
for (var i = len-1; i > -1; i--) { //Loop through current news[] pixels from last to first (won't include pixels added to the array later in the process)
var x = news[i].x //store location of new pixel to be checked
var y = news[i].y
if (y-skipSize > 0 && y+skipSize < image.bitmap.height && x-skipSize > 0 && x+skipSize < image.bitmap.width) { //make sure new pixel is not at the edge of the image
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize)) //START: check neighbor above
if (color.b == 255) { //if neighbor is white
var used = false
for (var j=0; j<news.length && used == false; j++) { //loop through new pixels
if (news[j].x == x && news[j].y == y-skipSize) { //check if neighbor is already added
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) { //loop through saved pixels (already in blob)
if (links[j].x == x && links[j].y == y-skipSize) { //check if neighbor is already used
used = true
}
}
if (used == false) {
news.push(new Link(x,y-skipSize)) //add neighbor to news[]
}
}
} //END: check neighbor above
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize)) //START: check neighbor below
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x && news[j].y == y+skipSize) {
used = true
}
if (used) {
break
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x && links[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
news.push(new Link(x,y+skipSize))
}
}
} //END: check neighbor below
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y)) //START: check neighbor left
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x-skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x-skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x-skipSize,y))
}
}
} //END: check neighbor left
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y)) //START: check neighbor right
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x+skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x+skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x+skipSize,y))
}
}
} //END: check neighbor right
}
if (isEdge(image,x,y,1)) { //check if new pixel is an edge
edges.push(new Link(x,y)) //add new pixel to edges[] (for calculating blob's radii later)
}
links.push(news[i]) //add this pixel to the new blob
news.splice(i,1) //remove this pixel from news[], as it's now checked
}
iteration++
}
if (links.length > 5) { //only add blob if it's size is somewhat significant
//console.log("...BLOB ADDED @ " + startX + "," + startY) //print blob's initial point
blobsFound.addBlob(1) //add an empty blob (constructor is not currently important)
blobsFound.blobs[blobsFound.blobs.length-1].links = links //fill blob's links[] array
blobsFound.blobs[blobsFound.blobs.length-1].edges = edges //fill blob's edges[] array
}
else {
//console.log("BLOB TOO SMALL")
}
}
}
}
//console.log("+: " + pixNums[0] + ", -: " + pixNums[1]) //not important
}
function isEdge(image, x, y, type) { //Edges used for finding the radii of a blob
var neighbors = 0
var color
if (x+skipSize < image.bitmap.width && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0 && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize >0 && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (neighbors > 1 && neighbors < 7) {
return true
}
else {
return false
}
}
function markBlobs(image) { //Show where the program found blobs
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].links.length > 5) {
var location = [blobsFound.blobs[i].aspects[0],blobsFound.blobs[i].aspects[1]]
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),math.round(location[0]),math.round(location[1]))
for (var j=0; j<blobsFound.blobs[i].edges.length; j++) {
location = [blobsFound.blobs[i].edges[j].x,blobsFound.blobs[i].edges[j].y]
image.setPixelColor(jimp.rgbaToInt(0,255,0,255),location[0],location[1])
}
}
}
}
function analyzeBlobs() { //Calculate data of a blob
for (var i=0; i<blobsFound.blobs.length; i++) {
blobsFound.blobs[i].calculateCenterRadii()
if (blobsFound.blobs[i].aspects[7] == 1) {
blobsFound.blobs[i].calculateLinearityDirection()
}
else if (blobsFound.blobs[i].aspects[7] == 2) {
blobsFound.blobs[i].calculateCircularity()
}
}
}
function findLines() { //Use blob data to find most likely path
var Lnum = 0;
var bestLine = [0, 0] //[best linearity so far, index of that blob]
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 1 && blobsFound.blobs[i].links.length > 10) {
if (blobsFound.blobs[i].aspects[5] > bestLine[0]) {
bestLine[0] = blobsFound.blobs[i].aspects[5]
bestLine[1] = i
}
Lnum++
}
}
if (blobsFound.blobs.length > 0 && Lnum > 0) {
var lineHeading = blobsFound.blobs[bestLine[1]].aspects[6]
var angleDifference = math.abs((math.pi*1.5) - lineHeading)
if (angleDifference > math.pi) {
angleDifference = (2*math.pi) - angleDifference
}
if (angleDifference > 0.5*math.pi) {
lineHeading += math.pi
}
if (lineHeading > 2*math.pi) {
lineHeading -= 2*math.pi
}
var lineData = [blobsFound.blobs[bestLine[1]].aspects[0],blobsFound.blobs[bestLine[1]].aspects[1],lineHeading]
}
else {
var lineData = [-1,-1,-1]
}
return lineData
}
function findJunctions() { //Use blob data to find most likely junction
var Jnum = 0
var bestCircularity = [20, 0] //[lowest circularity so far, blob#]
var bestDensity = [0, 0] //[density, blob#]
var bestBlob = 0
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 2 && blobsFound.blobs[i].links.length > 20) {
Jnum++
var circularity = blobsFound.blobs[i].aspects[3]
if (circularity < bestCircularity[0]) {
bestCircularity[0] = circularity
bestCircularity[1] = i
bestBlob = i
}
var density = blobsFound.blobs[i].aspects[4] //Not used right now...
}
}
if (blobsFound.blobs.length > 0 && Jnum > 0) {
var junctionData = [blobsFound.blobs[bestBlob].aspects[0],blobsFound.blobs[bestBlob].aspects[1],blobsFound.blobs[bestBlob].aspects[2]]
}
else {
var junctionData =[-1,-1,-1]
}
return junctionData
}
function BlobLibrary() {
this.blobs = []
}
BlobLibrary.prototype.addBlob = function(color) {
this.blobs = this.blobs.concat(new Blob(color))
}
function Blob(color) {
this.links = []
this.edges = []
this.radii = []
this.aspects = [8]
this.aspects[0] = 320 //X
this.aspects[1] = 200 //Y
this.aspects[2] = 50 //R adius
this.aspects[3] = 3 //C ircularity
this.aspects[4] = 5 //D ensity
this.aspects[5] = 0 //L inearity
this.aspects[6] = 0 //A ngle
this.aspects[7] = color //C olor (1=line,2=junction)
}
Blob.prototype.addLink = function(x, y) {
this.links = this.links.concat(new Link(x, y))
}
Blob.prototype.addEdge = function(x, y) {
this.edges = this.edges.concat(new Link(x, y))
}
Blob.prototype.calculateCenterRadii = function() {
var X = 0
var Y = 0
var edgeRadii = [this.edges.length]
for (var i=0; i<this.links.length; i++) {
X += this.links[i].x
Y += this.links[i].y
}
X /= this.links.length
Y /= this.links.length
this.aspects[0] = X
this.aspects[1] = Y
for (var i=0; i<this.edges.length; i++) {
var edgeRadius = math.sqrt(math.pow(this.edges[i].x - this.aspects[0],2) + math.pow(this.edges[i].y - this.aspects[1],2))
edgeRadii[i] = edgeRadius
}
this.radii = edgeRadii
if (this.radii.length > 0) {
var avgRadius = 0
for (var i=0; i<this.radii.length; i++) {
avgRadius += this.radii[i]
}
avgRadius /= this.radii.length
this.aspects[2] = avgRadius
}
}
Blob.prototype.calculateCircularity = function() {
if (this.radii.length > 0) {
var avgDifference = 0
for (var i=0; i<this.radii.length; i++) {
avgDifference += (this.radii[i] - this.aspects[2])
}
avgDifference /= this.radii.length
this.aspects[3] = avgDifference
}
this.aspects[4] = this.links.length / this.aspects[2]
}
Blob.prototype.calculateLinearityDirection = function() {
var shortest = 700
var longest = 0
var arrow = [1,1]
for (var i=0; i<this.radii.length; i++) {
var edgeRadius = this.radii[i]
if (edgeRadius < shortest) {
shortest = edgeRadius
}
if (edgeRadius > longest) {
longest = edgeRadius
arrow[0] = this.edges[i].x - this.aspects[0]
arrow[1] = this.edges[i].y - this.aspects[1]
}
}
var linearity = longest - shortest
this.aspects[5] = linearity
var angle = math.atan2(math.abs(arrow[1]), math.abs(arrow[0]))
if (arrow[0] < 0 && arrow[1] > 0) {
angle = math.pi - angle
}
else if (arrow[0] < 0 && arrow[1] < 0) {
angle = math.pi + angle
}
else if (arrow[0] > 0 && arrow[1] < 0) {
angle = (2*math.pi) - angle
}
this.aspects[6] = angle
}
function Link(x, y) {
this.x = x
this.y = y
}
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] >origin[1]+4) {
// client.front(0.08)
// }
client.stop()
client.front(0.02);
command = [0,0]
console.log("PATH FOUND :)") //found path
}
}
}
else { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] > origin[1]+4) {
// client.front(0.08)
// }
command = [0,0]
console.log("LOST :(") //can't locate path
}
}
else {
if (count >= 500 && count < 510) { //if count meets criteria, drone lands
client.stop()
client.land()
}
}
}
function processImage(input) { //Find path and junction in image
pngImage = input
jimp.read(pngImage, function(err, image) {
if (err) throw err
image = thresholdImage(image)
findBlobsNoRecursion(image)
analyzeBlobs()
var line = findLines()
// var marker = findJunctions()
//
// if (marker[0] > -1 && marker[1] > -1) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0],marker[1])
// for (var i=0; i<marker[2]; i++) {
// if (marker[0] + i + 1 < image.bitmap.width) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0]+i+1,marker[1])
// }
// }
// }
// else {
// //console.log("NO JUNCTIONS")
// }
if (line[0] > -1 && line[1] > -1 && line[2] > -1) {
var vectorX = math.cos(line[2]) * 1
var vectorY = math.sin(line[2]) * 1
for (var i=1; i<20; i++) {
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),line[0] + math.round(vectorX*i),line[1] + math.round(vectorY*i))
}
image.setPixelColor(jimp.rgbaToInt(255,255,0,255),line[0] + math.round(vectorX*20),line[1] + math.round(vectorY*20))
pathX = line[0]
pathY = line[1]
pathA = line[2]
}
else {
//console.log("NO LINES")
}
markBlobs(image)
//image.write("./droneControlOutput/img_" + count + ".png")
// markerX = marker[0]
// markerY = marker[1]
// markerR = marker[2]
})
}
function thresholdImage(image) { //Color thresholding (looking at image to figure things out, color-wise)
for (var y = 0; y < image.bitmap.height - skipSize; y += skipSize) {
for (var x = 0; x < image.bitmap.width - skipSize; x += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(x,y))
if (color.r / color.b > (color1[0]/color1[2]) - 1.5 && color.r / color.b < (color1[0]/color1[2]) + 2.5 && color.r / color.g > (color1[0]/color1[1]) - 1 && color.r / color.g < (color1[0]/color1[1]) + 2.5) { //~ORANGE
image.setPixelColor(jimp.rgbaToInt(255,255,255,255),x,y)
}
/*else if (color.r / color.b > (color2[0]/color2[2]) - 0.5 && color.r / color.b < (color2[0]/color2[2]) + 0.5 && color.r / color.g > (color2[0]/color2[1]) - 0.5 && color.r / color.g < (color2[0]/color2[1]) + 0.5) { //GREEN
image.setPixelColor(jimp.rgbaToInt(100,100,100,255),x,y)
}*/
else {
image.setPixelColor(jimp.rgbaToInt(0,0,0,255),x,y)
}
}
}
return image
}
function findBlobsNoRecursion(image) { //Find groups of pixels of the same color (grouping colors)
blobsFound.blobs = [] //clear blobs from previous image
var pixNums = [0,0] //just to keep track of how many pixels were kept vs. how many were not after thresholding
for (var startY = 0; startY < image.bitmap.height - skipSize; startY += skipSize) { //Loop through all pixels (accounting for skipSize) in the image
for (var startX = 0; startX < image.bitmap.width - skipSize; startX += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(startX,startY)) //Get color of current pixel (startX,startY)
var inBlob = false
if (color.b > 0) { //thresholded pixel values: 255 = path color, 100 = second color (currently disabled)
pixNums[0]++
for (var i=0; i<blobsFound.blobs.length; i++) { //Loop through all blobs found so far to check if current pixel has already been used
for (var j=0; j<blobsFound.blobs[i].links.length && inBlob == false; j++) {
if (blobsFound.blobs[i].links[j].x == startX && blobsFound.blobs[i].links[j].y == startY) {
inBlob = true
}
}
}
}
else {
pixNums[1]++
}
if (inBlob == false && color.b > 0) { //If pixel is within threshold and not already used, then create a new blob
var edges = [] //A selection of links that will be used to find blob radii outside of findBlobsNoRecursion()
var links = [] //Points that will make up the new blob
var news = [] //Points that haven't been checked yet for new neighboring white pixels
news.push(new Link(startX,startY)) //Add first pixel to news
var iteration=0 //Just for me to see how long it takes for the program to finish the blob
while (news.length > 0) { //While there are still pixels whose neighbors are not checked...
var len = news.length //Number of pixels which, as of now, aren't checked
for (var i = len-1; i > -1; i--) { //Loop through current news[] pixels from last to first (won't include pixels added to the array later in the process)
var x = news[i].x //store location of new pixel to be checked
var y = news[i].y
if (y-skipSize > 0 && y+skipSize < image.bitmap.height && x-skipSize > 0 && x+skipSize < image.bitmap.width) { //make sure new pixel is not at the edge of the image
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize)) //START: check neighbor above
if (color.b == 255) { //if neighbor is white
var used = false
for (var j=0; j<news.length && used == false; j++) { //loop through new pixels
if (news[j].x == x && news[j].y == y-skipSize) { //check if neighbor is already added
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) { //loop through saved pixels (already in blob)
if (links[j].x == x && links[j].y == y-skipSize) { //check if neighbor is already used
used = true
}
}
if (used == false) {
news.push(new Link(x,y-skipSize)) //add neighbor to news[]
}
}
} //END: check neighbor above
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize)) //START: check neighbor below
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x && news[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x && links[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
news.push(new Link(x,y+skipSize))
}
}
} //END: check neighbor below
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y)) //START: check neighbor left
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x-skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x-skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x-skipSize,y))
}
}
} //END: check neighbor left
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y)) //START: check neighbor right
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x+skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x+skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x+skipSize,y))
}
}
} //END: check neighbor right
}
if (isEdge(image,x,y,1)) { //check if new pixel is an edge
edges.push(new Link(x,y)) //add new pixel to edges[] (for calculating blob's radii later)
}
links.push(news[i]) //add this pixel to the new blob
news.splice(i,1) //remove this pixel from news[], as it's now checked
}
iteration++
}
if (links.length > 5) { //only add blob if its size is somewhat significant
//console.log("...BLOB ADDED @ " + startX + "," + startY) //print blob's initial point
blobsFound.addBlob(1) //add an empty blob (constructor is not currently important)
blobsFound.blobs[blobsFound.blobs.length-1].links = links //fill blob's links[] array
blobsFound.blobs[blobsFound.blobs.length-1].edges = edges //fill blob's edges[] array
}
else {
//console.log("BLOB TOO SMALL")
}
}
}
}
//console.log("+: " + pixNums[0] + ", -: " + pixNums[1]) //not important
}
function isEdge(image, x, y, type) { //Edges used for finding the radii of a blob
var neighbors = 0
var color
if (x+skipSize < image.bitmap.width && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0 && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize >0 && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
return neighbors > 1 && neighbors < 7
}
function markBlobs(image) { //Show where the program found blobs
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].links.length > 5) {
var location = [blobsFound.blobs[i].aspects[0],blobsFound.blobs[i].aspects[1]]
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),math.round(location[0]),math.round(location[1]))
for (var j=0; j<blobsFound.blobs[i].edges.length; j++) {
location = [blobsFound.blobs[i].edges[j].x,blobsFound.blobs[i].edges[j].y]
image.setPixelColor(jimp.rgbaToInt(0,255,0,255),location[0],location[1])
}
}
}
}
function analyzeBlobs() { //Calculate data of a blob
for (var i=0; i<blobsFound.blobs.length; i++) {
blobsFound.blobs[i].calculateCenterRadii()
if (blobsFound.blobs[i].aspects[7] == 1) {
blobsFound.blobs[i].calculateLinearityDirection()
}
else if (blobsFound.blobs[i].aspects[7] == 2) {
blobsFound.blobs[i].calculateCircularity()
}
}
}
function findLines() { //Use blob data to find most likely path
var Lnum = 0;
var bestLine = [0,0] //[linearity, blob index]
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 1 && blobsFound.blobs[i].links.length > 10) {
if (blobsFound.blobs[i].aspects[5] > bestLine[0]) {
bestLine[0] = blobsFound.blobs[i].aspects[5]
bestLine[1] = i
}
Lnum++
}
}
if (blobsFound.blobs.length > 0 && Lnum > 0) {
var lineHeading = blobsFound.blobs[bestLine[1]].aspects[6]
var angleDifference = math.abs((math.pi*1.5) - lineHeading)
if (angleDifference > math.pi) {
angleDifference = (2*math.pi) - angleDifference
}
if (angleDifference > 0.5*math.pi) {
lineHeading += math.pi
}
if (lineHeading > 2*math.pi) {
lineHeading -= 2*math.pi
}
var lineData = [blobsFound.blobs[bestLine[1]].aspects[0],blobsFound.blobs[bestLine[1]].aspects[1],lineHeading]
}
else {
var lineData = [-1,-1,-1]
}
return lineData
}
function findJunctions() { //Use blob data to find most likely junction
var Jnum = 0
var bestCircularity = [20,0] //[circularity, blob#]
var bestDensity = [0,0] //[density, blob#]
var bestBlob = 0
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 2 && blobsFound.blobs[i].links.length > 20) {
Jnum++
var circularity = blobsFound.blobs[i].aspects[3]
if (circularity < bestCircularity[0]) {
bestCircularity[0] = circularity
bestCircularity[1] = i
bestBlob = i
}
var density = blobsFound.blobs[i].aspects[4] //Not used right now...
}
}
if (blobsFound.blobs.length > 0 && Jnum > 0) {
var junctionData = [blobsFound.blobs[bestBlob].aspects[0],blobsFound.blobs[bestBlob].aspects[1],blobsFound.blobs[bestBlob].aspects[2]]
}
else {
var junctionData =[-1,-1,-1]
}
return junctionData
}
function BlobLibrary() {
this.blobs = []
}
BlobLibrary.prototype.addBlob = function(color) {
this.blobs = this.blobs.concat(new Blob(color))
}
function Blob(color) {
this.links = []
this.edges = []
this.radii = []
this.aspects = new Array(8)
this.aspects[0] = 320 //X
this.aspects[1] = 200 //Y
this.aspects[2] = 50 //R adius
this.aspects[3] = 3 //C ircularity
this.aspects[4] = 5 //D ensity
this.aspects[5] = 0 //L inearity
this.aspects[6] = 0 //A ngle
this.aspects[7] = color //C olor (1=line,2=junction)
}
Blob.prototype.addLink = function(x, y) {
this.links = this.links.concat(new Link(x, y))
}
Blob.prototype.addEdge = function(x, y) {
this.edges = this.edges.concat(new Link(x, y))
}
Blob.prototype.calculateCenterRadii = function() {
var X = 0
var Y = 0
var edgeRadii = new Array(this.edges.length)
for (var i=0; i<this.links.length; i++) {
X += this.links[i].x
Y += this.links[i].y
}
X /= this.links.length
Y /= this.links.length
this.aspects[0] = X
this.aspects[1] = Y
for (var i=0; i<this.edges.length; i++) {
var edgeRadius = math.sqrt(math.pow(this.edges[i].x - this.aspects[0],2) + math.pow(this.edges[i].y - this.aspects[1],2))
edgeRadii[i] = edgeRadius
}
this.radii = edgeRadii
if (this.radii.length > 0) {
var avgRadius = 0
for (var i=0; i<this.radii.length; i++) {
avgRadius += this.radii[i]
}
avgRadius /= this.radii.length
this.aspects[2] = avgRadius
}
}
Blob.prototype.calculateCircularity = function() {
if (this.radii.length > 0) {
var avgDifference = 0
for (var i=0; i<this.radii.length; i++) {
avgDifference += (this.radii[i] - this.aspects[2])
}
avgDifference /= this.radii.length
this.aspects[3] = avgDifference
}
this.aspects[4] = this.links.length / this.aspects[2]
}
Blob.prototype.calculateLinearityDirection = function() {
var shortest = 700
var longest = 0
var arrow = [1,1]
for (var i=0; i<this.radii.length; i++) {
var edgeRadius = this.radii[i]
if (edgeRadius < shortest) {
shortest = edgeRadius
}
if (edgeRadius > longest) {
longest = edgeRadius
arrow[0] = this.edges[i].x - this.aspects[0]
arrow[1] = this.edges[i].y - this.aspects[1]
}
}
var linearity = longest - shortest
this.aspects[5] = linearity
var angle = math.atan2(math.abs(arrow[1]), math.abs(arrow[0]))
if (arrow[0] < 0 && arrow[1] > 0) {
angle = math.pi - angle
}
else if (arrow[0] < 0 && arrow[1] < 0) {
angle = math.pi + angle
}
else if (arrow[0] > 0 && arrow[1] < 0) {
angle = (2*math.pi) - angle
}
this.aspects[6] = angle
}
function Link(x, y) {
this.x = x
this.y = y
}
Summer Research Week 3 Homework
Prezi Notes
- Research and Planning
- Flight
- Stabilization
- Camera
- Ultrasonic sensors
- Communication Protocols
- Handling information
- First Attempt
- use landmarks as indicators when traveling
- Node.js
- helps communicate and receives images
- Second Attempt
- following a marker instead of navigating to it
- Third Attempt
- switch camera view to bottom in order to follow a set path
- more difficult rea
- Our Final Product
- uses the computer to analyze where it is supposed to go
- Future Research
- use more markers to help the drone
- work around the problem
Video Notes
This week we looked at the code to see if there was anything familiar or something that we understood. We did not understand most of the code, but we were able to extract some information from them. We took notes on the code and we will be continuing to do this to see if we progressed throughout the summer. Adding on, we played around with node.js to see how it functioned. We shared info with each other and started to build on top of each other's ideas. I watched videos to understand node.js better because I did not feel comfortable with coding. Both of us also went through last year's presentation and we understood most of the slides.
- Important concepts, thenewboston 3
- making objects in node.js same as javascript
- typing in javascript = code that runs in browser
- unless stuck in a while loop
- usually completes
- node.js runs on server and continuously
- prints out return data in function
- any function without a specific return statement gives undefined
- can set variable equal to function
- function does not need a name when stored in a variable
- a function without a name is an anonymous function
- when a function is stored in a variable, you can use set functions
- useful to use it in other functions
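The ideas in these notes can be sketched in a few lines of Node.js (the function and variable names here are ours, just for illustration):

```javascript
// A function without a specific return statement gives back undefined.
function noReturn() {
  var x = 1 + 1
}

// A function can be stored in a variable; it then needs no name (anonymous function).
var greet = function (name) {
  return "hello " + name
}

// A function stored in a variable is easy to pass into other functions.
function callTwice(fn, arg) {
  return fn(arg) + ", " + fn(arg)
}

console.log(noReturn()) // undefined
console.log(greet("drone"))
console.log(callTwice(greet, "drone"))
```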
- Handling requests, thenewboston 4
- speed and a lot of commands
- node.js fast and efficient
- setTimeout(_____, 1000): 1,000 ms = 1 second before the callback runs
- schedule code to happen at x seconds
- all of them are complete in this time
- other things can be completed when the setTimeout function is in operation
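A minimal sketch of the setTimeout behavior described above (the variable names are ours): the scheduled callback does not block the code that comes after it.

```javascript
var order = []

order.push("first")

// Schedule this callback to run after 1000 ms (1,000 = 1 second).
setTimeout(function () {
  order.push("third")
}, 1000)

// This line runs immediately — Node.js does not wait for the timer.
order.push("second")

// At this point order only holds "first" and "second";
// "third" is added one second later.
console.log(order)
```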
Sunday, July 17, 2016
Summer Research Week 3 (07/18 - 07/22)
- Install the plug-in of Adobe Flash Player to view the presentation.
- It seems that you need more time to finish all the assignments from last week. There will be no new assignments this week. Also, try to spend some time picking up the Node.js. Take notes when you watch video tutorials.
Summer Research Week 2 Homework
Progress:
Began by attempting to open the presentation from last year and was presented with an error message for a missing plugin. Will attempt to figure out which plugin is missing later and reopen the presentation. Took a brief glance through last year's blog to get a sense of how posts should work. Unsure of what information to take, but it was interesting to see the progress that they made. I installed Node.js and downloaded the ar-drone module. However, I received warnings in Terminal when running the script to install the module. The warning given is attached below. I noticed that the second YouTube channel in the post had tutorials on installation, so I will watch it to check my installation process. Alan and I have also created a GitHub account.

Questions:
- How does the code from last year work?
Sunday, July 10, 2016
Summer Research Week 2 (07/11 - 07/15)
Prior Art
- Final Presentation: Start with the final presentation in details from last year. Pay attention to the problems they have encountered in each stage, and the solutions they have tried. Also pay attention to the "Resource" slide. You might want to include many useful links and update your "Project Resource" page.
- Blog: Browse through the team blog from last year. Pick up anything useful to you.
- Node.js: Download and install Node.js from nodejs.org.
- AR Drone Module: Get the ar-drone module from GitHub.
- Nodecopter: Use the code in Nodecopter to program the drone.
- Node.js Tutorials: You may start watching some tutorials of node.js at your own pace from YouTube channels such as thenewboston, Learning NodeJs, etc.
- Download the code from last year. Try to understand the high-level structure and function of the code.
Friday, July 8, 2016
Summer Research Week 1 Homework
Notes for AR Drone Developer Guide, Parrot [chapt. 1-3, 17 pages]
- AR Drone 2.0
- quadrotor
- opposite rotors are turning in the same direction
- two rotors go clockwise, two rotors go counterclockwise
- diagonal = opposites
- powered by engines driven by three-phase current
- controlled by micro-controller
- automatically detects various engines that are connected and adjusts engine controls
- can detect scenarios to prevent damage
- signals whenever the battery is low to prevent damage
- comes with inertial measurement units that facilitate how the drone maneuvers; they also help with control and can sense height
- has a camera that can stream incoming images to device
- video frame rate starts off at 15 FPS and can go up to 30 FPS (frames per second)
- Maneuvering the drone
- changing the left and right rotor speeds the opposite way
- roll movement
- outcome, side-to-side motion
- changing the front and rear rotor speeds the opposite way
- pitch movement
- outcome, forward and backward motion
- changing each rotor pair speed the opposite way
- yaw movement
- outcome, turning left and right
- remote control
- lever and trims, control UAV pitch, roll, yaw and throttle
- easy to do because sensors can help take-off, hover, trim, and land automatically
- pushing certain buttons can maneuver the drone automatically; these settings can be tuned
- altitude
- yaw speed limit
- vertical speed limit
- AR. Drone 2.0 tilt angle limit
- host tilt angle limit
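The rotor-speed changes in the notes above can be modeled as a toy function. This is our illustration of the guide's description, not the Parrot API; the rotor ordering [frontLeft, frontRight, rearLeft, rearRight] and the function name are assumptions.

```javascript
// Illustrative model only — not real drone code. Rotor speeds are
// [frontLeft, frontRight, rearLeft, rearRight], base speed plus/minus a delta.
function rotorSpeeds(base, delta, movement) {
  if (movement === "roll") {      // left pair vs. right pair → tilt sideways
    return [base + delta, base - delta, base + delta, base - delta]
  }
  if (movement === "pitch") {     // front pair vs. rear pair → tilt forward/back
    return [base + delta, base + delta, base - delta, base - delta]
  }
  if (movement === "yaw") {       // one diagonal pair vs. the other → rotate
    return [base + delta, base - delta, base - delta, base + delta]
  }
  return [base, base, base, base] // hover: all rotors equal
}
```

Because diagonal rotors spin in the same direction, speeding up one diagonal pair relative to the other changes the net torque without tilting the drone, which is why the yaw case adjusts diagonals.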
- Flying the Drone
- outdoor
- light and low wind drag configuration
- indoor
- has to be protected by external bumpers
- tags can be added to external hull to allow drones to easily detect one another via their cameras
- Client Device & Drone
- WiFi
- AR Drone 2.0 creates a WiFi network with ESSID
- it gives itself an odd IP address
- User connects device to the ESSID network provided by the drone
- Device requests an IP address from drone DHCP server
- Drone DHCP server gives the client an IP address equal to the drone's own IP address plus a number between one and four
- Client device can send requests to AR. Drone IP address and its service ports
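The address handshake above can be sketched as a toy helper (our illustration of the guide's DHCP description, not real drone code; the function name and example address are ours):

```javascript
// Toy illustration: the drone's DHCP server hands the client the drone's own
// address plus a number n between 1 and 4.
function clientAddress(droneIp, n) {
  var parts = droneIp.split(".")
  var last = parseInt(parts[3], 10) + n
  return parts[0] + "." + parts[1] + "." + parts[2] + "." + last
}

console.log(clientAddress("192.168.1.1", 1)) // 192.168.1.2
```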
- Communication with Client Device
- UDP Port 5556
- crucial to the user
- commands are sent frequently
- helps with controlling and changing the drone
- "AT commands"
- UDP Port 5554
- information about drone is sent to the client device
- qualities such as position and speed
- commands are sent frequently
- navdata
- has a prepared Navdata receiving and decoding system
- TCP Port 5555
- a video stream from AR. Drone
- images from the video stream can be decoded
- TCP Port 5559
- communication channel for critical data
- retrieves configuration data and figures out which information is important
- AR. Drone Vocabulary
- soft - drone-specific code
- common - header (.h) files that describe the communication structures used by the drone
- pack C structures when compiling them
- Liblardrone_tool - set of tools that makes the user's experience easier by making it simpler to use the drone
- Liblutils - set of tools used to write applications based on the AR. Drone
- FFMPEG - has complete snapshots of FFMPEG library with build scripts for AR. Drone applications
- ITTIAM - already written and highly optimized video decoding library for iOS and Android applications
- VPSDK - has libraries with different functions
- VPSTAGES - video processing pieces, allows the user to build a video processing pipeline
- VPOS - multiplatform (Linux/Windows/Parrot) wrappers for functions such as memory allocation and thread management
- VPCOM - multiplatform wrappers for communication through WiFi and Bluetooth
- VPAPI - helps to manage video pipelines and threads
- ardrone_tool.c - has a prepared ardrone_tool_main C function which starts the WiFi network and communication with the drone
- UI - prepared gamepad management code
- AT - all functions that can be used to control the AR. Drone 2.0
- use AT command and is backed up with the correct syntax and sequencing number
- forwarded to the AT management thread
- academy - prepared downloading and uploading system for AR. Drone Academy
- also includes management of the photo shooting
- control - prepared AR. Drone configuration management tool
- AR. Drone 2.0 Threads
- ARDroneTool is part of the AR. Drone 2.0 Library and provides:
- AT command management thread
- collects the commands sent by the other threads and orders them with a specific sequence number
- navdata management thread
- receives navdata stream and decodes it
- provides prepared navigation data through a callback
- video management thread
- receives the video stream
- provides prepared video data through a callback
- video recorder thread
- manages the HD stream recording
- manages .mp4/.mov encapsulation
- control thread
- regulates the requests from other threads that send commands to the drone
- makes sure that the requests match up with the drone's settings
- threads for AR. Drone Academy
- receives the photo shots (.jpg) via the FTP protocol
- manages userbox binary data retrieval and uploads to the AR. Academy server
- all threads communicate with the drone through the vp_com library, which is in charge of reconnecting when necessary
- all threads are enabled by ardrone_tool_main, which is provided in the ardrone_tool.c file
- AR. Drone 2.0 for Apple iOS Devices
- AR. Drone Engine (ControlEngine folder) provides everything an iOS application needs to use the drone
- provides a common drone API and user interface
- only interface to drone from the iOS application
- purpose is to access the ARDroneLIB
- AR. Drone Engine automatically opens, receives, and decodes video stream coming from drone
- does not render the decoded frames on screen
- the application is in charge of showing frames
- provides HUD with different input buttons and data about the condition of the drone
- back
- return to the home screen and pauses the ARDroneTool
- this setting can be stopped in HUD configuration
- settings
- shows the AR. Drone settings screen
- emergency
- stops the AR. Drone motors no matter what it is doing
- switch
- alternates between the cameras of the drone
- this setting can be stopped in HUD configuration
- record
- begins and ends the recording
- this setting can be stopped in HUD configuration
- screenshot
- takes a photo from the front facing camera of the drone
- this setting can be stopped in HUD configuration
- take off/landing
- helps the drone either take off if it is on ground or land if it is flying
- the take-off button resets emergency state
- battery level
- the drone's battery is shown numerically
- this setting can be stopped in HUD configuration
- WiFi indicator
- provides a quality estimation of the WiFi link with drone
- USB indicator
- displays the record time on the USB drive if a USB drive is plugged in
- Warning message label
- CONTROL LINK NOT AVAILABLE
- wifi connection lost
- START NOT RECEIVED
- drone was not able to take off when commanded
- CUT OUT EMERGENCY
- motor(s) were stopped by environment
- MOTORS EMERGENCY
- motor(s) not responding to commands
- CAMERA EMERGENCY
- camera(s) not functioning
- PIC WATCHDOG EMERGENCY
- navboard is not responding
- PIC VERSION EMERGENCY
- navboard is not updated
- TOO MUCH ANGLE EMERGENCY
- drone's euler angles went too high
- motors are shut down to prevent damage
- BATTERY LOW EMERGENCY
- automatically lands the drone
- USER EMERGENCY
- emergency button pressed
- ULTRASOUND EMERGENCY
- sensor for ultrasound not functioning
- UNKNOWN EMERGENCY
- should not happen if applications are updated
- VIDEO CONNECTION ALERT
- stream cannot be received
- BATTERY LOW ALERT
- cannot take off because of low battery
- if flying, nothing happens
- ULTRASOUND ALERT
- cannot detect altitude of drone
- VISION ALERT
- cannot estimate speed from bottom facing camera
Notes from AR. Drone 2.0 Tutorial Video #1 and #2
- Connect AR. Drone to battery
- Lights on the drone's four motors will turn green when it is ready
- Connect the device to the drone via WiFi
- Use the application to use the drone
- Reset WiFi and reopen the application after update
- Buttons displayed on the device are used to control the drone's movement
- Displays on the right of Emergency
- use vertical camera under drone
- start and stop recording
- camera
- You can change settings from the pre-set beginner settings
- remember to adjust indoor or outdoor
- different hulls for indoor and outdoor
- Before taking off, make sure the drone is flat on the ground
- Press Flat Trim, calibrates AR Drone and confirms position
- Front camera faces forward; stand behind the drone
- Take Off
- Right Button, rotate and altitude
- Up or down, higher or lower
- Left or right, left or right
- Left Button
- tilt device forward to move around
- when not pressed then it hovers around the area
- joystick mode can be enabled in settings to avoid tilting device
- Absolute Control
- press calibration
- overcome difficulties position in flight
- Flip
- enable the function
- press screen twice rapidly
After reading the first three chapters of the guide, we felt overwhelmed because there was a lot of information to take in. There were so many new terms and functions that we had to know in order to control the drone. Information about the actual drone is not essential, but the warnings and the applications are definitely needed to maneuver the drone well. We also thought that the reading seemed biased when describing the quality of the drone; the guide claimed that the drone was super simple to use and efficient, which could be exaggerated. The videos were extremely helpful because they summarized the information in the guide within 7 minutes. I found them more entertaining and easier to understand. We did not get to fly the drone because we could not meet up, but we completed the assignment.
Questions
- What is Navdata or Navboard?
- How does the drone give itself an IP address and why is it always odd?
- What is the purpose of the drone DHCP server giving a numerical value between one to four?
Sunday, July 3, 2016
Summer Research Week 1 (07/04 - 07/08)
AR Drone 2.0
- Read AR Drone Developer Guide, Parrot [chap. 1-3, 17 pages]. This guide will give you an overview of the structure, operations, hardware, software, and communication of AR Drone 2.0. Take notes on what you have learned.
- Download the FreeFlight app from the App Store (for iPhone/iPad) or from the Play Store (for Android). Watch AR Drone video Tutorials #1 and #2, and learn how to fly the drone. Identify the features and app options you want to try out later.
- Fly the drone! First, pay attention to all the safety concerns about yourself, the bystanders, the drone, and the surrounding environment before you fly. You can grab the drone while flying and turn it upside-down, which will cause it to enter an emergency mode that shuts down the motors. This is great for self-defense, as well as for stopping a drone that is out of control. It's better to fly it in an open field. Start at low speed and limited height before trying more advanced movements. Go through all the basic movements (e.g., forward, backward, left, right, clockwise, counter-clockwise, up, down, etc.) that the drone can do, and try to combine some of them to create more complex movements (e.g., forward and up, U-turn, spiral-up, etc.). Create various complex movements yourselves, and take notes on what they are and how to create them. These exercises will give you first-hand experience and understanding of drone navigation. Later on, you are going to make the drone perform these tasks autonomously, so it's crucial that you understand them thoroughly before you can program them.
- How to Use Git and GitHub, Udacity. Sign up for Udacity to access some of the free materials. Start this free course at your own pace to pick up Git and GitHub. You are going to use them to manage your software projects.
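The core Git workflow from the course can be practiced right away. A minimal sketch, where the directory name, file name, and commit message are made up for illustration:

```shell
# Start a fresh repository in a scratch directory (paths are illustrative)
rm -rf /tmp/drone-notes && mkdir -p /tmp/drone-notes && cd /tmp/drone-notes
git init
git config user.name "Student"               # local identity so commits work
git config user.email "student@example.com"
echo "Week 1: flew the AR Drone 2.0" > notes.txt
git add notes.txt                            # stage the new file
git commit -m "Add week 1 flight notes"      # record a snapshot in history
git log --oneline                            # shows one line per commit
# Publishing to GitHub would then look like:
#   git remote add origin <your-repo-URL>
#   git push -u origin master
```

Repeating this add/commit cycle for each change is the habit the course builds; GitHub only adds the `remote`/`push` step on top.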