Swarm Robots
Tuesday, September 20, 2016
Initial Project Planning
1. Project Description and Merits
In this project, our group wants to design swarm robots that can communicate with one another to complete tasks. For our STEM project, we want the robots to work together to clean a room. We think this could advance the field because it would improve many people's lives: no more vacuuming or mopping the floor, since the swarm robots could do it more quickly and efficiently. It could also benefit AI, because each robot has to comprehend its own task and be responsible for carrying it out. If our project works for simple tasks like cleaning floors, it may be possible to make robots do other tasks, such as lifting heavy objects or guarding against intruders, all of which would make human life easier. From a business point of view, swarm robots could also improve efficiency; eventually they could bring food to sick patients in hospitals, provide security for companies, and so on.
2. Group/Team Communication
Ryan Atkinson, Alan Li and Jacob Shaw will be working on this project. We will communicate with one another via Skype.
3. Prior Work/Resource Inventory
People have done many things with swarm robots recently. There are videos on YouTube showing their efficiency and usefulness: swarm robots and drones can build bridges on their own, play music, and even carry a heavy object to a particular person together. Harvard has also worked on swarm robots that could form shapes by cooperating with one another. In our STEM project, we hope to build on this by having swarm robots clean the floor; along the way, the project may also uncover something new that benefits swarm robotics.
4. Technology Analysis
Coding - we need to program the robots so that they can either assign or carry out tasks. This will take a lot of work because we are not very familiar with coding, yet it is the most important part of the project. The robots need something to prompt them to take certain actions, and the code provides that. We also have to make it possible for the robots to process their environment, because each one is responsible for its own task; image processing and communication are necessary. If the robots are not aware of their surroundings while working on different assignments, the result will be a mess.
Mechanical Engineering - designing and building the robots. A robot's structure changes the way it functions; a robot without wheels can't be expected to move around efficiently. We need to learn how to design and construct a robot that fulfills our needs, and we have some experience here because Ryan and Jacob took robotics.
5. Competence
Programming, having a specific language/platform for swarm robots
Engineering/Designing
Patience
Being organized
Being precise
6. Safety
If constructing from scratch, soldering irons may be required.
Be gentle with materials.
Use an open area when testing
Pack things up neatly
To avoid these issues, we need to make sure we have time at the end of class to clean up properly. We should also be gentle with the materials: no throwing them around or leaving them in random places where others could break them. To keep an open area while testing, we need to make sure no one else is walking through it; if someone accidentally steps on one of the robots, we will have to replace it.
7. Equipment, Materials, and Budget (list is subject to change)
Robotics Equipment
Arduino
~$500
8. Schedule
We want to make sure the central unit can send out information to the other robots. If we are able to do this, then we would like the central unit to do it autonomously. This week we will be thinking about what platforms and equipment we need for the project, researching the topic, and finding materials that may benefit us.
Sunday, September 18, 2016
Here is the presentation from Friday. Yay
https://docs.google.com/a/erhsnyc.net/presentation/d/16KGwEsMyo5s_kXiAvcHnqClPUcmdychUJgAoUpNroHU/edit?usp=sharing
Wednesday, August 31, 2016
Summer Research Weeks 7+8 Homework
AR Drone Guide Notes
Chapter 7 - Incoming Data Stream
- Navigation Data - navdata (navigation data) is sent to the application continuously so it can keep up with the drone's condition.
- Navigation Data Stream
- navdata sent from UDP Port 5554
- information is stored in a binary format
- made up of several pieces of data called options
- each option = 2 bytes (header)
- important options are
- navdata_demo_t
- navdata_cks_t
- navdata_host_angles_t
- navdata_vision_detect_t
- content is found in C structure primarily in navdata_common.h
- Initiating the reception of Navigation data
- to receive navigation data
- drone is in BOOTSTRAP mode when starting
- status and sequence counter set only
- to exit BOOTSTRAP mode, send AT command to modify the default settings on drone
- AT*CONFIG="general:navdata_demo","TRUE"<CR>
- AT*CTRL=0
- the drone is then fully started and navdata demos are sent
- Augmented reality data stream
- drone detects up to four tags or oriented roundel
- The AR. Drone 1.0 video stream
- Image Structure
- image is split into groups of blocks (GOBs)
- each GOB is split into macroblocks of 16x16 pixels
- UVLC codec overview - the UVLC codec is close to JPEG
- P264 codec Review
- I-frames are complete frames and do not need any other frames to decode
- P-frames use previous frames to predict
- use other frames as a reference to build upon
- the best reference is carried over from the reference picture to the new image, or P-picture
- Initiating the video stream
- client needs to send UDP packet on drone video port
- starts to stream
- if there is no connection between the client and drone
- stream ends
- The AR.Drone 2.0 video stream
- uses H264 (MPEG4.10 AVC) for its video stream
- FPS, frames per second, 15 ~ 30
- Bitrate, 250 kbps and 4Mbps
- Resolution: 360p (640x360) or 720p (1280x720)
- On Apple products it will change
- Default 720p, 30FPS, 4Mbps
- Live stream uses MPEG4.2 Visual encoder
- can be adjusted between 15 to 30 fps and 250 kbps to 1Mbps
- video frames are sent with custom headers which inform the user about the frames
- headers can be found on page 68
- Network transmission of video stream
- transmitted on TCP socket 5555
- Drone immediately sends frames when connected to socket
- since a frame can be split across numerous TCP packets
- the application must reassemble it before decoding
- In ARDroneTool done within Video/video_stage_tcp.c file
- Latency reduction mechanism
- latency arises because TCP retransmission delivers every frame to the application, even stale ones
- the latency reduction mechanism in the Video/video_stage_tcp.c file removes the older frames and sends the newer ones to the decoder
- Video record stream
- uses TCP socket 5553 to send out H264-720p frames
- the stream will not run if the application is not running
- converting the H264 stream into more accessible files like .mov or .mp4 is done by the Video/video_stage_encoded_recorder.c file
- uses utils/ardrone_video_atoms and utils/ardrone_video_encapsuler
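The notes above say a frame can be split across numerous TCP packets and must be reassembled before decoding. A sketch of that reassembly in node.js, where the PaVE field offsets (signature at byte 0, header size as a little-endian uint16 at offset 6, payload size as a little-endian uint32 at offset 8) are assumptions about the custom header the guide describes:

```javascript
// Reassembles complete video frames from a TCP byte stream, as the guide says
// the application must do before decoding. The PaVE field offsets used here
// are assumptions taken from the custom per-frame header the guide mentions.
function makeFrameParser(onFrame) {
  let pending = Buffer.alloc(0);
  return function feed(chunk) {
    pending = Buffer.concat([pending, chunk]);
    while (pending.length >= 12) {
      if (pending.toString('ascii', 0, 4) !== 'PaVE') {
        pending = pending.slice(1); // resync on a bad byte
        continue;
      }
      const headerSize = pending.readUInt16LE(6);
      const payloadSize = pending.readUInt32LE(8);
      if (pending.length < headerSize + payloadSize) break; // wait for more data
      onFrame(pending.slice(headerSize, headerSize + payloadSize));
      pending = pending.slice(headerSize + payloadSize);
    }
  };
}
```

The ar-drone module used in our code notes hides this and exposes parsed streams (e.g. getPngStream), so this is only needed when talking to the drone directly.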
Chapter 8 - Drone Configuration
- With ARDroneTool
- include <ardrone_tool/ardrone_tool_configuration.h> file into code to access ardrone_control_config structure
- has the configuration of drone
- Without ARDroneTool
- to get drone configuration without ARDroneTool, send the AT*CTRL command with a mode parameter equal to 4
- sends on control communication port (TCP port 5559)
- Setting the drone configuration
- With ARDroneTool
- ARDRONE_TOOL_CONFIGURATION_ADDEVENT
- set configuration parameters
- allows the drone to understand new adjustments
- First parameter = name
- Second is a pointer to the new value that will be sent to the drone
- Third is a callback that signals completion
- the callback receives 1 on success and 0 on failure (in which case the call is retried eventually)
- From the Control Engine for iPhone
- Without ARDroneTool
- AT*CONFIG with correct sequence number, parameter name between double quotes, and parameter value between double quotes to configure the drone
- Multiconfiguration
- share AR.Drone with different configurations
- Configuration Keys:
- CAT_COMMON - default, all application
- holds config keys common to all applications and users
- CAT_APPLI - setting saved for current application
- application specific configuration
- video encoding & navdata_options
- CAT_USER - setting saved for current user
- switch active users at runtime for applications
- CAT_SESSION - setting saved for whole flight
- current flight settings
- active video camera & detection
- default setting after reboot or disconnect
- With ARDRoneTool
- ardrone_tool_init function takes 2 string pointers to application name and user
- sets ardrone_config_t structure called ardrone_application_default_config which holds default configuration
- sent to AR.Drone and overwrites default configuration
- Without ARDroneTool
- if new configuration, AR.Drone requires AT*CONFIG_IDS identifiers that match the default configuration before AT*CONFIG to change settings
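A sketch of the AT*CONFIG_IDS + AT*CONFIG pairing described above; the three ids and the key/value are placeholder examples, not real session ids:

```javascript
// Without ARDroneTool, every AT*CONFIG in a multiconfiguration session must be
// preceded by an AT*CONFIG_IDS carrying the session, user, and application ids.
// All id and key/value strings below are placeholder examples.
function configWithIds(seq, ids, key, value) {
  return (
    'AT*CONFIG_IDS=' + seq + ',"' + ids.session + '","' + ids.user + '","' + ids.application + '"\r' +
    'AT*CONFIG=' + (seq + 1) + ',"' + key + '","' + value + '"\r'
  );
}
```

The settings only take effect when the ids match the ones registered on the AR.Drone; ARDroneTool does this pairing automatically.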
- General Configuration - GENERAL:
- CAT_COMMON, read only
- num_version_config - configuration subsystem version
- num_version_mb - drone motherboard hardware version
- num_version_soft - drone firmware version
- drone_serial - drone serial number
- soft_build_date - drone firmware compilation date
- motor1_soft - motor 1 board software version, applicable to other motors
- motor1_hard - motor 1 board hardware version, applicable to other motors
- motor1_supplier - motor 1 board supplier version, applicable to other motors
- flying_time - how long the drone has spent flying, in seconds
- vbat_min - minimum battery life before AR.Drone shutting down
- CAT_COMMON, read/write
- ardrone_name - name of AR.Drone
- AT command example: AT*CONFIG=605,"general:ardrone_name","My ARDrone Name"
- API use example: ARDRONE_TOOL_CONFIGURATION_ADDEVENT (ardrone_name, "My ARDrone Name", myCallback);
- navdata_demo - whether to send clients the reduced navdata set or all available information (which may contain irrelevant data)
- TRUE = reduced
- FALSE = all data
- AT command example: AT*CONFIG=605,"general:navdata_demo","TRUE"
- API use example:
bool_t value = TRUE;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (navdata_demo, &value, myCallback);
- com_watchdog - how long the drone can wait without being commanded by a client program; if exceeded, the drone enters the Com Watchdog triggered state (hovering)
- disabled at the moment
- video_enable - TRUE default, should not be FALSE (not implemented)
- vision_enable - TRUE default, should not be FALSE (not implemented)
- CAT_APPLI, read/write
- navdata_options - application asks for navdata packets, all navdata packets are found in navdata_common.h file
- AT command example: AT*CONFIG=605,"general:navdata_options","105971713"
- Control Configuration - CONTROL:
- CAT_COMMON, read only
- accs_offset - AR.Drone accelerometers offsets
- accs_gains - AR.Drone accelerometers gains
- gyros_offset - AR.Drone gyrometers offsets
- gyros_gains - AR.Drone gyrometers gains
- gyros110_offset -
- gyros110_gains -
- magneto_offset -
- magneto_radius -
- gyro_offset_thr_x - also for y and z axis
- pwm_ref_gyros -
- osctun_value -
- osctun_test -
*All of the above are Parrot internal debug information*
- CAT_APPLI, read/write
- control_level - how drone interprets progressive commands from user
- Bit 0 is global enable bit, should be active
- Bit 1 is combined yaw mode, roll commands make roll+yaw based turns, good for racing
- AT command example: AT*CONFIG=605,"control:control_level","3"
- CAT_USER, read/write
- euler_angle_max - maximum bending angle in radians for pitch&roll angles
- prefer ardrone_at_set_progress_cmd_with_magneto for AR.Drone 2.0, AT command: AT*PCMD_MAG
- parameter is a value between 0 to 0.52
- AT*CONFIG=605,"control:euler_angle_max",".25"
- control_iphone_tilt
- angle in radians for iPhone accelerometer commands
- on AR.FreeFlight, progressive command sent is between 0 and 1 for angles going from 0 to 90.
- AT command example: AT*CONFIG=605,"control:control_iphone_tilt",".25"
- API use example:
float iTiltMax = .25;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (control_iphone_tilt, &iTiltMax, myCallback);
- control_vz_max
- maximum vertical speed of AR.Drone, millimeters per second, 200~2000
- AT command example: AT*CONFIG=605,"control:control_vz_max","1000"
- API use example:
uint32_t vzMax = 1000;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (control_vz_max, &vzMax, myCallback);
- control_yaw - maximum yaw speed of AR.Drone in radians per second, 0.7 rad/s to 6.11 rad/s
- AT command example: AT*CONFIG=605,"control:control_yaw","3.0"
- API use example:
float yawSpeed = 3.0;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (control_yaw, &yawSpeed, myCallback);
- indoor_euler_angle_max - used when CONTROL:outdoor is false
- indoor_control_vz_max - used when CONTROL:outdoor is false
- indoor_control_yaw - used when CONTROL:outdoor is false
- outdoor_euler_angle_max - used when CONTROL:outdoor is true
- outdoor_control_vz_max - used when CONTROL:outdoor is true
- outdoor_control_yaw - used when CONTROL:outdoor is true
- CAT_COMMON, read/write
- altitude_min - minimum drone altitude in millimeters
- AT command example: AT*CONFIG=605,"control:altitude_min","50"
- API use example:
uint32_t altMin = 50;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (altitude_min, &altMin, myCallback);
- altitude_max - maximum drone altitude in millimeters
- AT command example: AT*CONFIG=605,"control:altitude_max","3000"
- API use example:
uint32_t altMax = 3000;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (altitude_max, &altMax, myCallback);
- outdoor - tells control loop that AR.Drone is flying outside, adjusts to outdoor or indoor settings
- AT command example: AT*CONFIG=605,"control:outdoor","TRUE"
- API use example:
bool_t isOutdoor = TRUE;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (outdoor, &isOutdoor, myCallback);
- flight_without_shell - tells control loop that AR.Drone is using the outdoor hull; this is separate from the outdoor setting, since the hull used and the flying environment can differ
- AT command example: AT*CONFIG=605,"control:flight_without_shell","TRUE"
- API use example:
bool_t withoutShell = TRUE;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (flight_without_shell, &withoutShell, myCallback);
- autonomous_flight - no longer useful, but previously used to put the drone in autonomous flight mode
- flight_anim - launch drone animations, animations can be found in config.h file
- MAYDAY_TIMEOUT contains default duration for each flight animation
- AT command example: AT*CONFIG=605,"control:flight_anim","3,2"
- CAT_USER, read only
- manual_trim - should only be used if the drone is using manual trims; normally it should not be used - leave the FALSE setting
- CAT_SESSION, read/write
- flying_mode - has two flight modes which is either
- legacy FreeFlight mode where user controls the drone
- semi-autonomous mode, "HOVER_ON_TOP_OF_ROUNDEL" hover on top of ground tag
- AT command example: AT*CONFIG=605,"control:flying_mode","0"
- API use example:
FLYING_MODE fMode = FLYING_MODE_FREE_FLIGHT;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (flying_mode, &fMode, myCallback);
- hovering_range - used when flying mode is "HOVER_ON_TOP_OF_(ORIENTED_)ROUNDEL"
- Network Configuration - NETWORK:
- CAT_COMMON, read/write
- ssid_single_player - applied when rebooted
- AT command example: AT*CONFIG=605,"network:ssid_single_player","myArdroneNetwork"
- API use example:
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (ssid_single_player, "myArdroneNetwork", myCallback);
- ssid_multi_player - not usable
- wifi_mode - adjusts WiFi network, should not be changed
- 0 - drone is access point
- 1 - drone creates or joins network in Ad-Hoc mode
- 2 - drone joins network as a station
- wifi_rate - debug configuration, should not be modified
- owner_mac - MAC address paired with the AR.Drone
- reset with 00:00:00:00:00:00
- AT command example: AT*CONFIG=605,"network:owner_mac","01:23:45:67:89:ab"
- API use example:
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (owner_mac, "cd:ef:01:23:45:67", myCallback);
- Nav-board Configuration - PIC:
- CAT_COMMON, read/write
- ultrasound_freq
- frequency of ultrasound measures for altitude
- 22.22Hz or 25 Hz
- AT command example: AT*CONFIG=605,"pic:ultrasound_freq","7"
- API use example:
ADC_COMMANDS uFreq = ADC_CMD_SELECT_ULTRASOUND_22Hz;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (ultrasound_freq, &uFreq, myCallback);
- values are found in ardrone_common_config.h file
- ultrasound_watchdog - should not be modified
- CAT_COMMON, read only
- pic_version - software version of Nav-board
- Video Configuration - VIDEO:
- CAT_COMMON, read only
- camif_fps - FPS of video interface, may differ from actual framerate
- camif_buffers - buffer depth for video interface
- num_trackers - number of tracking points for the speed estimation
- video_storage_space - size of wifi video record buffer
- CAT_COMMON, read/write
- codec_fps - current FPS of live video codec, max 30
- AT command example: AT*CONFIG=605,"video:codec_fps","30"
- API use example:
uint32_t codecFps = 30;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (codec_fps, &codecFps, myCallback);
- video_on_usb - TRUE with a USB key that has more than 100 MB of space; the record stream will be placed there
- AT command example: AT*CONFIG=605,"video:video_on_usb","TRUE"
- API use example:
bool_t recordOnUsb = TRUE;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (video_on_usb, &recordOnUsb, myCallback);
- video_file_index - AR.Drone 2.0 number of the last video recorded on the USB key
- application should not adjust the value on key
- CAT_SESSION, read/write
- video_codec - current video codec of AR.Drone
- AR.Drone 2.0, start/stop record stream
- MP4_360P_CODEC - live stream with MPEG4.2 soft encoder, no record stream
- H264_360P_CODEC - live stream with H264 hardware encoder in 360p, no record stream
- MP4_360P_H264_720P_CODEC - live stream with MPEG4.2 soft encoder, record stream with H264 hardware encoder in 720p
- H264_720P_CODEC - live stream with H264 hardware encoder in 720p, no record stream
- MP4_360P_H264_360P_CODEC - live stream with MPEG4.2 soft encoder, record stream with H264 hardware encoder in 360p
- AT command example: AT*CONFIG=605,"video:video_codec","129"
- API use example:
codec_type_t newCodec = H264_360P_CODEC;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (video_codec, &newCodec, myCallback);
- video_slices - should not be modified
- video_live_socket - should not be modified
- CAT_APPLI, read only, multiconfig - read/write
- bitrate - when using bitrate control in "VBC_MANUAL", bitrate of video transmission in kilobits per second (500~4000kbps)
- different when in VBC_MODE_DYNAMIC, will change kbps dynamically
- AT command example: AT*CONFIG=605,"video:bitrate","1000"
- API use example:
uint32_t newBitrate = 4000;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (bitrate, &newBitrate, myCallback);
- bitrate_control_mode - enables automatic bitrate control of the video stream, which reduces the bandwidth used by the video stream under bad WiFi conditions
- CAT_SESSION, read only, multiconfig - read/write
- max_bitrate - maximum bitrate the device can handle
- VBC_MANUAL, maximum bitrate ignored
- VBC_MODE_DISABLED, maximum bitrate applied
- AT command example: AT*CONFIG=605,"video:max_bitrate","1000"
- API use example:
uint32_t newMaxBitrate = 4000;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (max_bitrate, &newMaxBitrate, myCallback);
- CAT_APPLI, read/write
- bitrate_storage - for AR.Drone 2.0, bitrate (kbps) of the recording stream
- both USB and WiFi record
- CAT_SESSION, read/write
- video_channel - video channel that will be sent to controller
- ZAP_CHANNEL_HORI
- ZAP_CHANNEL_VERT
- AT command example: AT*CONFIG=605,"video:video_channel","2"
- API use example:
ZAP_VIDEO_CHANNEL nextChannel = ZAP_CHANNEL_HORI;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (video_channel, &nextChannel, myCallback);
- Leds Configuration - LEDS:
- CAT_COMMON, read/write
- leds_anim - used to launch LED animations
- animation names found in led_animation.h file
- animation number, frequency, duration
- AT command example: AT*CONFIG=605,"leds:leds_anim","3,1073741824,2"
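The 1073741824 in the example above is not a magic number: float parameters in AT commands are sent as the signed 32-bit integer reading of their IEEE-754 bit pattern, and 1073741824 is the pattern for 2.0 (a 2 Hz animation). A sketch of the conversion in node.js:

```javascript
// Converts a float to the signed-integer form used in AT command arguments:
// write the IEEE-754 bytes, then read them back as a 32-bit integer.
function floatToAtInt(f) {
  const buf = Buffer.alloc(4);
  buf.writeFloatBE(f, 0);
  return buf.readInt32BE(0);
}
// floatToAtInt(2.0) → 1073741824, so "3,1073741824,2" means
// animation 3, frequency 2.0 Hz, duration 2 seconds.
```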
- Detection Configuration - DETECT:
- CAT_COMMON, read/write
- enemy_colors - what color hulls you want to detect: green, yellow, blue - 1,2,3
- AT command example: AT*CONFIG=605,"detect:enemy_colors","2"
- API use example:
uint32_t enemyColors = 2;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (enemy_colors, &enemyColors, myCallback);
- enemy_without_shell - activate to detect outdoor hulls, deactivate for indoor hulls
- AT command example: AT*CONFIG=605,"detect:enemy_without_shell","1"
- API use example:
uint32_t activated = 0;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (enemy_without_shell, &activated, myCallback);
- CAT_SESSION, read/write
- detect_type - active tag detection, values at ardrone_api.h
- CAD_TYPE_NONE - no detection
- CAD_TYPE_MULTIPLE_DETECTION_MODE - configure detection on camera
- CAD_TYPE_ORIENTED_COCARDE_BW - black and white oriented roundel detected on bottom facing camera
- CAD_TYPE_VISION_V2 - standard tag detection
- AT command example: AT*CONFIG=605,"detect:detect_type","10"
- API use example:
CAD_TYPE detectType = CAD_TYPE_MULTIPLE_DETECTION_MODE;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (detect_type, &detectType, myCallback);
- detections_select_h - bitfields to select detections that should be enabled on the horizontal camera, values at ardrone_api.h
- TAG_TYPE_NONE - no tag detection
- TAG_TYPE_SHELL_TAG_V2 - standard indoor and outdoor hulls tag
- TAG_TYPE_BLACK_ROUNDEL - black and white oriented roundel
- AT command example: AT*CONFIG=605,"detect:detections_select_h","1"
- API use example:
uint32_t detectH = TAG_TYPE_MASK (TAG_TYPE_SHELL_TAG);
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (detections_select_h, &detectH, myCallback);
- detections_select_v_hsync - bitfields to select detections that should be enabled on the vertical camera
- synced with the hsync mode at 30 fps, which reduces CPU load
- AT command example: AT*CONFIG=605,"detect:detections_select_v_hsync","2"
- API use example:
uint32_t detectVhsync = TAG_TYPE_MASK (TAG_TYPE_ROUNDEL);
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (detections_select_v_hsync, &detectVhsync, myCallback);
- detections_select_v - same as the previous point but without the hsync mode
- runs at 60 Hz instead of 30 fps
- heavier on CPU
- should not be used with previous point
- AT command example: AT*CONFIG=605,"detect:detections_select_v","2"
- API use example:
uint32_t detectV = TAG_TYPE_MASK (TAG_TYPE_ROUNDEL);
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (detections_select_v, &detectV, myCallback);
- USERBOX section - save navigation data from the drone during a time period and take photos
- USERBOX:userbox_cmd
- USERBOX_CMD_STOP - stop userbox, no parameter
- USERBOX_CMD_CANCEL - cancel userbox, deletes content in userbox as well
- USERBOX_CMD_START - start userbox, uses date as a string parameter
- USERBOX_CMD_SCREENSHOT - takes a photo from the AR.Drone, 3 parameters
- delay - delay between each screenshot
- number of burst - # of screenshots
- current date - date of screenshot
- AT command example: AT*CONFIG=605,"userbox:userbox_cmd","0"
- GPS section - GPS:
- CAT_SESSION, read/write
- latitude - GPS Latitude sent by device
- used for media tagging and userbox recording
- AT command example: AT*CONFIG=605,"gps:latitude","4631107791820423168"
- API use example:
double gpsLatitude = 42.0;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (latitude, &gpsLatitude, myCallback);
- longitude - GPS Longitude sent by device
- used for media tagging and userbox recording
- AT command example: AT*CONFIG=605,"gps:longitude","4631107791820423168"
- API use example:
double gpsLongitude = 42.0;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (longitude, &gpsLongitude, myCallback);
- altitude - GPS Altitude sent by device
- used for media tagging and userbox recording
- AT command example: AT*CONFIG=605,"gps:altitude","4631107791820423168"
- API use example:
double gpsAltitude = 42.0;
ARDRONE_TOOL_CONFIGURATION_ADDEVENT (altitude, &gpsAltitude, myCallback);
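The long integers in the GPS examples follow the same bit-pattern convention as floats elsewhere, but with doubles: 4631107791820423168 is the 64-bit IEEE-754 bit pattern of 42.0, matching the gpsLatitude value in the API example. A sketch (Node ≥ 12 for the BigInt buffer read):

```javascript
// Converts a double to the 64-bit integer form used in the GPS AT examples:
// the IEEE-754 bytes of the double, read back as a signed 64-bit integer.
function doubleToAtInt64(d) {
  const buf = Buffer.alloc(8);
  buf.writeDoubleBE(d, 0);
  return buf.readBigInt64BE(0);
}
// doubleToAtInt64(42.0) → 4631107791820423168n, the latitude example above.
```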
We remembered how you mentioned that the previous group had trouble with flight control and processing time, so we decided to read the later chapters of the guide. These two chapters contain a lot of information about the drone in general. We learned how to configure many settings on the drone - not necessarily the flight control or navigation data - and we think that it could help us in the future.
Sunday, August 14, 2016
Summer Research Week 6 Homework
Working with JSON Data, thenewboston 21
- need a .json file
- accessible on thenewboston github
- put it in main directory
- when copy and pasting code with messed up tabs
- ctrl+a
- ctrl+alt+l
- reformats code to organize
- local variables
- makes a "universal" variable for the whole program
- after app
- app.locals.any_word = variable
- makes something "local" to app
- type the word to display variable
- json = javascript object
- can set the local variable to a json file
- with json data, forEach is used a lot to list out everything
In this video, we learned about JSON data, which seems very useful. Using JSON data, we are able to list out multiple things and reuse a piece of information repeatedly without having to type it out numerous times.
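A minimal sketch of the forEach pattern from the video; the data and field names here are made-up stand-ins for whatever .json file the app actually loads:

```javascript
// Parse JSON data and list every entry with forEach, as in the tutorial.
// The array contents are invented example data.
const menu = JSON.parse('[{"name":"Coffee","price":2},{"name":"Tea","price":1}]');
const lines = [];
menu.forEach(function (item) {
  lines.push(item.name + ': $' + item.price);
});
console.log(lines.join('\n'));
```

In the Express app this array would typically come from a required .json file and be exposed through app.locals, as in the notes above.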
Passing JSON Data Using Routes, thenewboston 22
- code is specific to a single webpage
- put it in that route
- code that can be used everywhere
- app.js
- two dots means move up a directory
- finds the file from there
- var anything = require (json file)
- organizes the information
In this video, we learned how to organize the JSON data. It is crucial to be organized when coding, and this tutorial emphasized that. The code inside app.js applies to the whole program, and everything else should be in its own file.
- require('module')
- necessary when referencing a specific module
- this can be stored into a variable
- require ('./file') can also be stored into a variable
- to install new modules, you can type npm install module_name
- typing node file_name in the terminal will execute that file
- ardrone_tool_set_ui_pad_start
- AT*REF
- int value: take off flag
- 1 to take-off
- 0 to land
- AT*REF=%d,%d<CR>
- sequence number, integer value representing a 32 bit-wide bit-field controlling the drone
- ardrone_tool_set_ui_pad_select
- AT*REF
- int value: emergency flag
- 1 to send emergency signal to the drone
- tells the drone to stop and land
- check navdata to see the drone's state
- 0 to stop sending emergency signal
- ardrone_at_set_progress_cmd
- AT*PCMD
- int flags
- enables progressive commands and the new Combined Yaw control mode
- float phi
- left or right bending angle
- 0 is the horizontal plane
- negative means left
- positive means right
- float theta
- front or back bending angle
- 0 is the horizontal plane
- negative means front
- positive means back
- float gaz
- the vertical speed of the drone
- float yaw
- the angular speed of the drone around the yaw-axis
- everything should be between the values -1 to 1
- ardrone_at_set_progress_cmd_with_magneto
- also moves the drone
- however, it allows the Absolute Control mode
- AT*PCMD_MAG
- same as the prior but has
- float magneto_psi
- angle of controlling device from north, value between -1 to 1
- float magneto_psi_accuracy
- the accuracy of the magneto_psi value between -1 to 1
- like the bullet point before this, this does nothing if the drone is on ground
- AT*PCMD_MAG=%d,%d,%d,%d,%d,%d,%d<CR>
- sequence number, flag enabling the progressive commands and/or Combined Yaw mode (bitfield), phi, theta, gaz, yaw, magneto psi, magneto psi accuracy
- ardrone_at_set_flat_trim
- AT*FTRIM=%d<CR>
- sequence number
- sets a reference of the horizontal plane for the drone
- has to be called when drone is sitting on a horizontal ground
- stabilizes the drone when in air
- cannot be sent when the drone is already in air
- ardrone_at_set_calibration
- AT*CALIB=%d,%d<CR>
- sequence number, identifier of the device to calibrate
- identifier can be chosen from ardrone_calibration_device_t
- calibrate drone magnetometer
- must be sent when drone is flying
- ardrone_at_set_toy_configuration
- AT*CONFIG=%d,%s,%s<CR>
- sequence number, option to set (byte with hex, value 22h), option value between double quotes
- AT*CONFIG_IDS=%d,%s,%s,%s<CR>
- sequence number, session id, user id, application id
- use during multiconfiguration
- only applied when the id match the AR.Drone
- ARDroneTool does this automatically
- AT*COMWDG
- reset communication watchdog
These were notes that we found online, and we also looked at the ARDrone guide notes because we hadn't seen them since the first few weeks. It was a nice change from just studying node.js. There is a lot of information inside the guide that we have to review and understand thoroughly. Just from Chapters 4 and 6 alone, we feel like there is an overwhelming amount of knowledge that we need to comprehend.
Sunday, August 7, 2016
Summer Research Week 5 Homework: Code Notes
Continued annotating the code from last year. Progress so far is attached.
We certainly understood the code more, but we are nowhere close to understanding the whole code. We understand the console.log and what require means now. We also have a clue on what's going on in the pCommand and pCommandTimer section.
//ardroneAutonomousControl.js
//image = 640x360
//Blob detection
//Horizontal tracking
//Vertical tracking
//Marks Radius
/* AGENDA
√ Find blobs
√ Record edge links
√ Test bottom camera
√ Test if edge link detection is done accurately by marking them NOTE: I'm wondering if links should store an edge? var, if edge finding is asynchronous at all.
√ Fix glitches with blob detecting
√ (skipping blobs)
√ (green on bottom and right borders of the image)
√ Record radii from center to edge links
√ Record max-min radial difference
√ Find blob with largest difference (not average difference)
√ Get blob line
√ Find blob line direction
√ Mark path
√ Test + fix path marking
√ Use Ø(droneDirection-blobDirection) to control Yaw angle
√ Use bottom camera
• Incorporate second color for junctions, with original functions
√ Try getting navdata for its orientation to control more accurately it's drift
√ Figure out how to read navdata (it's not a straight string...)
√ Use edge pixels when finding junctions and clean up analyzeBlobs()
√ Incorporate navdata to help hovering
√ Fix the "max call stack size exceeded" error: don't use recursion for finding blobs anymore.
√ Fix new errors with findBlobsNoRecursion(): out-of-bounds[√], infinitely-large-blob[√] = problem: pixels that are already links are added as news.
√ Look up Hough functions that could possibly find lines and replace findBlobsNoRecursion()
• Fix drone movement:
try not updating line data if no new line is found [x],
don't do any command other than HOVER 2x in a row [x],
allow drone to do same command twice w/ timer [?],
have path shoulders which help if the drone is lost [ ]
try sending initial navdata-enabling command to see if altitude and velocity data becomes available [ ]
> the command is:
> client.config('general:navdata_demo', 'FALSE');
*/
/* COLOR KEY:
WHITE: line marker
GRAY: junction marker
RED: radius
BLUE: center, best path estimation
YELLOW: path direction head
GREEN: edge
*/
var ardrone = require('ar-drone') //the variable ardrone holds the ar-drone module
var jimp = require('./jimp-master/index.js')
var math = require('./mathjs-master/index.js') //Comprehensive math library (used for square root, exponents, absolute value, vector math, etc.)
//Navdata
var orientation = [0.000,0.000,0.000] //direction facing
var origin = [0,0,0] //start location
var client = ardrone.createClient()
var pngImage //640*360
var markerX = -1
var markerY = -1
var markerR = -1
var pathX = -1
var pathY = -1
var pathA = -1
var erosionFactor = 2
var count = 0
var skipSize = 10
var command = [0,0] //0,1,2,3,4
var pCommand = [0,0] //0,1,2,3,4 = HOVER,UP,RIGHT,DOWN,LEFT
var pCommandTimer = [0,0]; //counts how long the drone has been trying the same command
var timeOffCourse = 0;
var color1 = [240,100,100]
var color2 = [240,172,110]
var blobsFound = new BlobLibrary()
client.config("video:video_channel", 1)
var pngStream = client.getPngStream()
pngStream
.on("error", console.log)
.on("data", function(incoming) {
processImage(incoming)
})
client.on("navdata", function(navdata) { //runs every time the drone sends navdata; throttles repeated movement commands
getMotionData(navdata)
if (pCommand[0] == command[0]) { //same horizontal command as last time
pCommandTimer[0]++ //count how long it has been repeating
}
else {
pCommandTimer[0] = 0 //command changed: reset the timer
}
if (pCommandTimer[0] > 50) {
pCommand[0] = 0 //timed out: forget the previous command so it may repeat
}
else {
pCommand[0] = command[0] //remember the previous command (both branches originally assigned 0, which disabled the throttle)
}
if (pCommand[1] == command[1]) {
pCommandTimer[1]++
}
else {
pCommandTimer[1] = 0
}
if (pCommandTimer[1] > 45) {
pCommand[1] = 0
}
else {
pCommand[1] = command[1]
}
controlFlight()
count++
})
if (count < 30) { //runs once at startup, while count is still 0
client.takeoff()
}
//.................................................................... DECLARATION
function getMotionData(navdata) { //I wanted to stabilize the drone by countering its lean
if (count > 10) {
if (count < 30) { //origin = beginning
origin[0] = navdata.demo.rotation.roll
origin[1] = navdata.demo.rotation.pitch
origin[2] = navdata.demo.rotation.yaw
}
else { //orientation = facing
orientation[0] = navdata.demo.rotation.roll
orientation[1] = navdata.demo.rotation.pitch
orientation[2] = navdata.demo.rotation.yaw
}
}
}
function controlFlight() { //Control drone based on given path (X,Y,A)
if (count < 500 && count > 50) {
if (pathA > -1 && pathX > -1 && pathY > -1) {
var distance = math.sqrt(math.pow(pathX-(640*0.5),2) + math.pow(pathY-(320*0.5),2))
var angleV = math.pi * 1.5
angleV = pathA - angleV
if (distance > 320/3) { //CENTER OVER THE PATH OR MOVE FORWARD
timeOffCourse++;
var xMore = false;
var xV = pathX - (640*0.5)
var yV = pathY - (320*0.5)
if (math.abs(xV) > math.abs(yV)) {
xMore = true;
}
xV /= math.abs(xV)
yV /= math.abs(yV)
if ((timeOffCourse*0.001) < 0.04) {
xV *= 0.05 - (timeOffCourse*0.0005)
yV *= 0.05 - (timeOffCourse*0.0005)
}
else {
xV *= 0.005; //0.01
yV *= 0.005;
}
if (xV > 0.0) {
command[0] = 2
}
else if (xV < 0.0) {
command[0] = 4
}
if (yV > 0.0) {
command[1] = 3
}
else if (yV < 0.0) {
command[1] = 1
}
client.stop()
if ((pCommand[1] == 0 || pCommand[1] != command[1]) && !xMore) {
if (command[1] == 1) {
client.front(math.abs(yV))
console.log("FRONT")
}
else if (command[1] == 3) {
client.back(math.abs(yV))
console.log("BACK")
}
}
if ((pCommand[0] == 0 || pCommand[0] != command[0]) && xMore) {
if (command[0] == 2) {
client.right(math.abs(xV))
console.log("RIGHT")
}
else if (command[0] == 4) {
client.left(math.abs(xV*1.5))
console.log("LEFT") //log which direction the drone is being moved
}
}
}
else {
timeOffCourse = 0;
if (distance < 320/3 && math.abs(angleV) > 0/*(math.pi*0.1)*/) { //ROTATE
client.stop()
if (math.abs(angleV) < (math.pi*0.5)) {
if (angleV > 0) {
client.clockwise(0.1)
console.log("CLOCK")
}
else if (angleV < 0) {
client.counterClockwise(0.1)
console.log("COUNTER")
}
}
else {
console.log("PATH IS PERPENDICULAR")
}
}
if (distance < 320/3) { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] >origin[1]+4) {
// client.front(0.08)
// }
client.stop()
client.front(0.02);
command = [0,0]
console.log("PATH FOUND :)") //found path
}
}
}
else { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] > origin[1]+4) {
// client.front(0.08)
// }
command = [0,0]
console.log("LOST :(") //can't locate path
}
}
else {
if (count >= 500 && count < 510) { //once the frame count reaches 500, the drone lands
client.stop()
client.land()
}
}
}
function processImage(input) { //Find path and junction in image
pngImage = input
jimp.read(pngImage, function(err, image) {
if (err) throw err
image = thresholdImage(image)
findBlobsNoRecursion(image)
analyzeBlobs()
var line = findLines()
// var marker = findJunctions()
//
// if (marker[0] > -1 && marker[1] > -1) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0],marker[1])
// for (var i=0; i<marker[2]; i++) {
// if (marker[0] + i + 1 < image.bitmap.width) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0]+i+1,marker[1])
// }
// }
// }
// else {
// //console.log("NO JUNCTIONS")
// }
if (line[0] > -1 && line[1] > -1 && line[2] > -1) {
var vectorX = math.cos(line[2]) * 1
var vectorY = math.sin(line[2]) * 1
for (var i=1; i<20; i++) {
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),line[0] + math.round(vectorX*i),line[1] + math.round(vectorY*i))
}
image.setPixelColor(jimp.rgbaToInt(255,255,0,255),line[0] + math.round(vectorX*20),line[1] + math.round(vectorY*20))
pathX = line[0]
pathY = line[1]
pathA = line[2]
}
else {
//console.log("NO LINES")
}
markBlobs(image)
//image.write("./droneControlOutput/img_" + count + ".png")
// markerX = marker[0]
// markerY = marker[1]
// markerR = marker[2]
})
}
function thresholdImage(image) { //Color thresholding: keep pixels whose color ratios match the line color, black out the rest
for (var y = 0; y < image.bitmap.height - skipSize; y += skipSize) {
for (var x = 0; x < image.bitmap.width - skipSize; x += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(x,y))
if (color.r / color.b > (color1[0]/color1[2]) - 1.5 && color.r / color.b < (color1[0]/color1[2]) + 2.5 && color.r / color.g > (color1[0]/color1[1]) - 1 && color.r / color.g < (color1[0]/color1[1]) + 2.5) { //~ORANGE
image.setPixelColor(jimp.rgbaToInt(255,255,255,255),x,y)
}
/*else if (color.r / color.b > (color2[0]/color2[2]) - 0.5 && color.r / color.b < (color2[0]/color2[2]) + 0.5 && color.r / color.g > (color2[0]/color2[1]) - 0.5 && color.r / color.g < (color2[0]/color2[1]) + 0.5) { //GREEN
image.setPixelColor(jimp.rgbaToInt(100,100,100,255),x,y)
}*/
else {
image.setPixelColor(jimp.rgbaToInt(0,0,0,255),x,y)
}
}
}
return image
}
function findBlobsNoRecursion(image) { //Find groups of pixels of the same color (grouping colors)
blobsFound.blobs = [] //clear blobs from previous image
var pixNums = [0,0] //just to keep track of how many pixels were kept vs. how many were not after thresholding
for (var startY = 0; startY < image.bitmap.height - skipSize; startY += skipSize) { //Loop through all pixels (accounting for skipSize) in the image
for (var startX = 0; startX < image.bitmap.width - skipSize; startX += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(startX,startY)) //Get color of current pixel (startX,startY)
var inBlob = false
if (color.b > 0) { //blob type 1 = 255 (line), type 2 = 100 (junction)
pixNums[0]++
for (var i=0; i<blobsFound.blobs.length; i++) { //Loop through all blobs found so far to check if current pixel has already been used
for (var j=0; j<blobsFound.blobs[i].links.length && inBlob == false; j++) {
if (blobsFound.blobs[i].links[j].x == startX && blobsFound.blobs[i].links[j].y == startY) {
inBlob = true
}
}
}
}
else {
pixNums[1]++
}
if (inBlob == false && color.b > 0) { //If pixel is within threshold and not already used, then create a new blob
var edges = [] //A selection of links that will be used to find blob radii outside of findBlobsNoRecursion()
var links = [] //Points that will make up the new blob
var news = [] //Points that haven't been checked yet for new neighboring white pixels
news.push(new Link(startX,startY)) //Add first pixel to news
var iteration=0 //Just for me to see how long it takes for the program to finish the blob
while (news.length > 0) { //While there are still pixels whose neighbors are not checked...
var len = news.length //Number of pixels which, as of now, aren't checked
for (var i = len-1; i > -1; i--) { //Loop through current news[] pixels from last to first (won't include pixels added to the array later in the process)
var x = news[i].x //store location of new pixel to be checked
var y = news[i].y
if (y-skipSize > 0 && y+skipSize < image.bitmap.height && x-skipSize > 0 && x+skipSize < image.bitmap.width) { //make sure new pixel is not at the edge of the image
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize)) //START: check neighbor above
if (color.b == 255) { //if neighbor is white
var used = false
for (var j=0; j<news.length && used == false; j++) { //loop through new pixels
if (news[j].x == x && news[j].y == y-skipSize) { //check if neighbor is already added
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) { //loop through saved pixels (already in blob)
if (links[j].x == x && links[j].y == y-skipSize) { //check if neighbor is already used
used = true
}
}
if (used == false) {
news.push(new Link(x,y-skipSize)) //add neighbor to news[]
}
}
} //END: check neighbor above
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize)) //START: check neighbor below
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x && news[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x && links[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
news.push(new Link(x,y+skipSize))
}
}
} //END: check neighbor below
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y)) //START: check neighbor left
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x-skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x-skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x-skipSize,y))
}
}
} //END: check neighbor left
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y)) //START: check neighbor right
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x+skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x+skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x+skipSize,y))
}
}
} //END: check neighbor right
}
if (isEdge(image,x,y,1)) { //check if new pixel is an edge
edges.push(new Link(x,y)) //add new pixel to edges[] (for calculating blob's radii later)
}
links.push(news[i]) //add this pixel to the new blob
news.splice(i,1) //remove this pixel from news[], as it's now checked
}
iteration++
}
if (links.length > 5) { //only add blob if its size is somewhat significant
//console.log("...BLOB ADDED @ " + startX + "," + startY) //print blob's initial point
blobsFound.addBlob(1) //add an empty blob (constructor is not currently important)
blobsFound.blobs[blobsFound.blobs.length-1].links = links //fill blob's links[] array
blobsFound.blobs[blobsFound.blobs.length-1].edges = edges //fill blob's edges[] array
}
else {
//console.log("BLOB TOO SMALL")
}
}
}
}
//console.log("+: " + pixNums[0] + ", -: " + pixNums[1]) //not important
}
function isEdge(image, x, y, type) { //Edges used for finding the radii of a blob
var neighbors = 0
var color
if (x+skipSize < image.bitmap.width && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0 && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize >0 && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (neighbors > 1 && neighbors < 7) {
return true
}
else {
return false
}
}
function markBlobs(image) { //Show where the program found blobs
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].links.length > 5) {
var location = [blobsFound.blobs[i].aspects[0],blobsFound.blobs[i].aspects[1]]
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),math.round(location[0]),math.round(location[1]))
for (var j=0; j<blobsFound.blobs[i].edges.length; j++) {
location = [blobsFound.blobs[i].edges[j].x,blobsFound.blobs[i].edges[j].y]
image.setPixelColor(jimp.rgbaToInt(0,255,0,255),location[0],location[1])
}
}
}
}
function analyzeBlobs() { //Calculate data of a blob
for (var i=0; i<blobsFound.blobs.length; i++) {
blobsFound.blobs[i].calculateCenterRadii()
if (blobsFound.blobs[i].aspects[7] == 1) {
blobsFound.blobs[i].calculateLinearityDirection()
}
else if (blobsFound.blobs[i].aspects[7] == 2) {
blobsFound.blobs[i].calculateCircularity()
}
}
}
function findLines() { //Use blob data to find most likely path
var Lnum = 0;
var bestLine = [0,0] //[linearity, blob index]
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 1 && blobsFound.blobs[i].links.length > 10) {
if (blobsFound.blobs[i].aspects[5] > bestLine[0]) {
bestLine[0] = blobsFound.blobs[i].aspects[5]
bestLine[1] = i
}
Lnum++
}
}
if (blobsFound.blobs.length > 0 && Lnum > 0) {
var lineHeading = blobsFound.blobs[bestLine[1]].aspects[6]
var angleDifference = math.abs((math.pi*1.5) - lineHeading)
if (angleDifference > math.pi) {
angleDifference = (2*math.pi) - angleDifference
}
if (angleDifference > 0.5*math.pi) {
lineHeading += math.pi
}
if (lineHeading > 2*math.pi) {
lineHeading -= 2*math.pi
}
var lineData = [blobsFound.blobs[bestLine[1]].aspects[0],blobsFound.blobs[bestLine[1]].aspects[1],lineHeading]
}
else {
var lineData = [-1,-1,-1]
}
return lineData
}
function findJunctions() { //Use blob data to find most likely junction
var Jnum = 0
var bestCircularity = [20,0] //[circularity, blob index]
var bestDensity = [0,0] //[density, blob index] (not used right now)
var bestBlob = 0
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 2 && blobsFound.blobs[i].links.length > 20) {
Jnum++
var circularity = blobsFound.blobs[i].aspects[3]
if (circularity < bestCircularity[0]) {
bestCircularity[0] = circularity
bestCircularity[1] = i
bestBlob = i
}
var density = blobsFound.blobs[i].aspects[4] //Not used right now...
}
}
if (blobsFound.blobs.length > 0 && Jnum > 0) {
var junctionData = [blobsFound.blobs[bestBlob].aspects[0],blobsFound.blobs[bestBlob].aspects[1],blobsFound.blobs[bestBlob].aspects[2]]
}
else {
var junctionData =[-1,-1,-1]
}
return junctionData
}
function BlobLibrary() {
this.blobs = []
}
BlobLibrary.prototype.addBlob = function(color) {
this.blobs = this.blobs.concat(new Blob(color))
}
function Blob(color) {
this.links = []
this.edges = []
this.radii = []
this.aspects = [] //8 aspect slots, assigned below ([8] would create a one-element array, not a length-8 array)
this.aspects[0] = 320 //X
this.aspects[1] = 200 //Y
this.aspects[2] = 50 //Radius
this.aspects[3] = 3 //Circularity
this.aspects[4] = 5 //Density
this.aspects[5] = 0 //Linearity
this.aspects[6] = 0 //Angle
this.aspects[7] = color //Color (1=line,2=junction)
}
Blob.prototype.addLink = function(x, y) {
this.links = this.links.concat(new Link(x, y))
}
Blob.prototype.addEdge = function(x, y) {
this.edges = this.edges.concat(new Link(x, y))
}
Blob.prototype.calculateCenterRadii = function() {
var X = 0
var Y = 0
var edgeRadii = [] //one radius per edge point, filled in below
for (var i=0; i<this.links.length; i++) {
X += this.links[i].x
Y += this.links[i].y
}
X /= this.links.length
Y /= this.links.length
this.aspects[0] = X
this.aspects[1] = Y
for (var i=0; i<this.edges.length; i++) {
var edgeRadius = math.sqrt(math.pow(this.edges[i].x - this.aspects[0],2) + math.pow(this.edges[i].y - this.aspects[1],2))
edgeRadii[i] = edgeRadius
}
this.radii = edgeRadii
if (this.radii.length > 0) {
var avgRadius = 0
for (var i=0; i<this.radii.length; i++) {
avgRadius += this.radii[i]
}
avgRadius /= this.radii.length
this.aspects[2] = avgRadius
}
}
Blob.prototype.calculateCircularity = function() {
if (this.radii.length > 0) {
var avgDifference = 0
for (var i=0; i<this.radii.length; i++) {
avgDifference += (this.radii[i] - this.aspects[2])
}
avgDifference /= this.radii.length
this.aspects[3] = avgDifference
}
this.aspects[4] = this.links.length / this.aspects[2]
}
Blob.prototype.calculateLinearityDirection = function() {
var shortest = 700
var longest = 0
var arrow = [1,1]
for (var i=0; i<this.radii.length; i++) {
var edgeRadius = this.radii[i]
if (edgeRadius < shortest) {
shortest = edgeRadius
}
if (edgeRadius > longest) {
longest = edgeRadius
arrow[0] = this.edges[i].x - this.aspects[0]
arrow[1] = this.edges[i].y - this.aspects[1]
}
}
var linearity = longest - shortest
this.aspects[5] = linearity
var angle = math.atan2(math.abs(arrow[1]), math.abs(arrow[0]))
if (arrow[0] < 0 && arrow[1] > 0) {
angle = math.pi - angle
}
else if (arrow[0] < 0 && arrow[1] < 0) {
angle = math.pi + angle
}
else if (arrow[0] > 0 && arrow[1] < 0) {
angle = (2*math.pi) - angle
}
this.aspects[6] = angle
}
function Link(x, y) {
this.x = x
this.y = y
}
We certainly understand the code better now, but we are nowhere near understanding all of it. We now know what console.log does and what require means, and we have some idea of what is going on in the pCommand and pCommandTimer section.
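To check our understanding of the pCommand and pCommandTimer section, here is a minimal sketch of the idea as we read it: remember the previous command, count how long the same command keeps being requested, and forget the remembered command once the count passes a limit so the drone is allowed to repeat it. The function name updateThrottle and its state object are our own inventions for illustration, not part of the drone script.

```javascript
//Sketch (our own, hypothetical) of the command-throttling idea in the navdata handler.
//state.pCommand holds the previous command (0 = HOVER), state.timer counts repeats.
function updateThrottle(state, command, limit) {
  //a command is allowed when nothing is remembered or it differs from the previous one
  var allowed = (state.pCommand === 0 || state.pCommand !== command)
  if (state.pCommand === command) {
    state.timer++ //same command requested again: count it
  } else {
    state.timer = 0 //command changed: restart the count
  }
  if (state.timer > limit) {
    state.pCommand = 0 //timed out: forget the previous command so it may repeat
  } else {
    state.pCommand = command //remember it to block an immediate repeat
  }
  return allowed
}

var state = { pCommand: 0, timer: 0 }
console.log(updateThrottle(state, 2, 3)) //first RIGHT request passes
console.log(updateThrottle(state, 2, 3)) //immediate repeat is blocked
```

In the real script the same check appears as `pCommand[0] == 0 || pCommand[0] != command[0]` inside controlFlight(), with the timer updated in the navdata callback.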
//ardroneAutonomousControl.js
//image = 640x360
//Blob detection
//Horizontal tracking
//Vertical tracking
//Marks Radius
/* AGENDA
√ Find blobs
√ Record edge links
√ Test bottom camera
√ Test if edge link detection is done accurately by marking them NOTE: I'm wondering if links should store an edge? var, if edge finding is asynchronous at all.
√ Fix glitches with blob detecting
√ (skipping blobs)
√ (green on bottom and right borders of the image)
√ Record radii from center to edge links
√ Record max-min radial difference
√ Find blob with largest difference (not average difference)
√ Get blob line
√ Find blob line direction
√ Mark path
√ Test + fix path marking
√ Use Ø(droneDirection-blobDirection) to control Yaw angle
√ Use bottom camera
• Incorporate second color for junctions, with original functions
√ Try getting navdata for its orientation to control more accurately it's drift
√ Figure out how to read navdata (it's not a straight string...)
√ Use edge pixels when finding junctions and clean up analyzeBlobs()
√ Incorporate navdata to help hovering
√ Fix the "max call stack size exceeded" error: don't use recursion for finding blobs anymore.
√ Fix new errors with findBlobsNoRecursion(): out-of-bounds[√], infinitely-large-blob[√] = problem: pixels that are already links are added as news.
√ Look up Hough functions that could possibly find lines and replace findBlobsNoRecursion()
• Fix drone movement:
try not updating line data if no new line is found [x],
don't do any command other than HOVER 2x in a row [x],
allow drone to do same command twice w/ timer [?],
have path shoulders which help if the drone is lost [ ]
try sending initial navdata-enabling command to see if altitude and velocity data becomes available [ ]
> the command is:
> client.config('general:navdata_demo', 'FALSE');
*/
/* COLOR KEY:
WHITE: line marker
GRAY: junction marker
RED: radius
BLUE: center, best path estimation
YELLOW: path direction head
GREEN: edge
*/
var ardrone = require('ar-drone') //the varaible ardrone requires the information from ar-drone
var jimp = require('./jimp-master/index.js')
var math = require('./mathjs-master/index.js') //Comprehensive math library (used for square root, exponents, absolute value, vector math, etc.)
//Navdata
var orientation = [0.000,0.000,0.000] //direction facing
var origin = [0,0,0] //start location
var client = ardrone.createClient()
var pngImage //640*360
var markerX = -1
var markerY = -1
var markerR = -1
var pathX = -1
var pathY = -1
var pathA = -1
var erosionFactor = 2
var count = 0
var skipSize = 10
var command = [0,0] //0,1,2,3,4
var pCommand = [0,0] //0,1,2,3,4 = HOVER,UP,RIGHT,DOWN,LEFT
var pCommandTimer = [0,0]; //counts how long the drone has been trying the same command
var timeOffCourse = 0;
var color1 = [240,100,100]
var color2 = [240,172,110]
var blobsFound = new BlobLibrary()
client.config("video:video_channel", 1)
var pngStream = client.getPngStream()
pngStream
.on("error", console.log)
.on("data", function(incoming) {
processImage(incoming)
})
client.on("navdata", function(navdata) { //regulates the drone's movement?
getMotionData(navdata)
if (pCommand[0] == command[0]) { //if pCommand[0] is equal to command [0], add
pCommandTimer[0]++ //adds one to the commandTimer
}
else {
pCommandTimer[0] = 0 //stays the name
}
if (pCommandTimer[0] > 50) {
pCommand[0] = 0
}
else {
pCommand[0] = 0
}
if (pCommand[1] == command[1]) {
pCommandTimer[1]++
}
else {
pCommandTimer[1] = 0
}
if (pCommandTimer[1] > 45) {
pCommand[1] = 0
}
else {
pCommand[1] = 0
}
controlFlight()
count++
})
if (count < 30) {
client.takeoff()
}
//.................................................................... DECLARATION
function getMotionData(navdata) { //I wanted to stabilize the drone by countering it's lean
if (count > 10) {
if (count < 30) { //origin = beginning
origin[0] = navdata.demo.rotation.roll
origin[1] = navdata.demo.rotation.pitch
origin[2] = navdata.demo.rotation.yaw
}
else { //orientation = facing
orientation[0] = navdata.demo.rotation.roll
orientation[1] = navdata.demo.rotation.pitch
orientation[2] = navdata.demo.rotation.yaw
}
}
}
function controlFlight() { //Control drone based on given path (X,Y,A)
if (count < 500 && count > 50) {
if (pathA > -1 && pathX > -1 && pathY > -1) {
var distance = math.sqrt(math.pow(pathX-(640*0.5),2) + math.pow(pathY-(320*0.5),2))
var angleV = math.pi * 1.5
angleV = pathA - angleV
if (distance > 320/3) { //CENTER OVER THE PATH OR MOVE FORWARD
timeOffCourse++;
var xMore = false;
var xV = pathX - (640*0.5)
var yV = pathY - (320*0.5)
if (math.abs(xV) > math.abs(yV)) {
xMore = true;
}
xV /= math.abs(xV)
yV /= math.abs(yV)
if ((timeOffCourse*0.001) < 0.04) {
xV *= 0.05 - (timeOffCourse*0.0005)
yV *= 0.05 - (timeOffCourse*0.0005)
}
else {
xV *= 0.005; //0.01
yV *= 0.005;
}
if (xV > 0.0) {
command[0] = 2
}
else if (xV < 0.0) {
command[0] = 4
}
if (yV > 0.0) {
command[1] = 3
}
else if (yV < 0.0) {
command[1] = 1
}
client.stop()
if ((pCommand[1] == 0 || pCommand[1] != command[1]) && !xMore) {
if (command[1] == 1) {
client.front(math.abs(yV))
console.log("FRONT")
}
else if (command[1] == 3) {
client.back(math.abs(yV))
console.log("BACK")
}
}
if ((pCommand[0] == 0 || pCommand[0] != command[0]) && xMore) {
if (command[0] == 2) {
client.right(math.abs(xV))
console.log("RIGHT")
}
else if (command[0] == 4) {
client.left(math.abs(xV*1.5))
console.log("LEFT") //if certain conditions are met, then it will display which direction to the move the drone?
}
}
}
else {
timeOffCourse = 0;
if (distance < 320/3 && math.abs(angleV) > 0/*(math.pi*0.1)*/) { //ROTATE
client.stop()
if (math.abs(angleV) < (math.pi*0.5)) {
if (angleV > 0) {
client.clockwise(0.1)
console.log("CLOCK")
}
else if (angleV < 0) {
client.counterClockwise(0.1)
console.log("COUNTER")
}
}
else {
console.log("PATH IS PERPENDICULAR")
}
}
if (distance < 320/3) { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] >origin[1]+4) {
// client.front(0.08)
// }
client.stop()
client.front(0.02);
command = [0,0]
console.log("PATH FOUND :)") //found path
}
}
}
else { //HOVER
// if (orientation[0] < origin[0]-4) {
// client.right(0.08)
// }
// else if (orientation[0] > origin[0]+4) {
// client.left(0.08)
// }
// if (orientation[1] < origin[1]-4) {
// client.back(0.08)
// }
// else if (orientation[1] > origin[1]+4) {
// client.front(0.08)
// }
command = [0,0]
console.log("LOST :(") //can't locate path
}
}
else {
if ((count > 500 || count == 500) && count < 510) { //if count meets criteria, drone lands
client.stop()
client.land()
}
}
}
function processImage(input) { //Find path and junction in image
pngImage = input
jimp.read(pngImage, function(err, image) {
if (err) throw err
image = thresholdImage(image)
findBlobsNoRecursion(image)
analyzeBlobs()
var line = findLines()
// var marker = findJunctions()
//
// if (marker[0] > -1 && marker[1] > -1) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0],marker[1])
// for (var i=0; i<marker[2]; i++) {
// if (marker[0] + i + 1 < image.bitmap.width) {
// image.setPixelColor(jimp.rgbaToInt(255,0,0,255),marker[0]+i+1,marker[1])
// }
// }
// }
// else {
// //console.log("NO JUNCTIONS")
// }
if (line[0] > -1 && line[1] > -1 && line[2] > -1) {
var vectorX = math.cos(line[2]) * 1
var vectorY = math.sin(line[2]) * 1
for (var i=1; i<20; i++) {
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),line[0] + math.round(vectorX*i),line[1] + math.round(vectorY*i))
}
image.setPixelColor(jimp.rgbaToInt(255,255,0,255),line[0] + math.round(vectorX*20),line[1] + math.round(vectorY*20))
pathX = line[0]
pathY = line[1]
pathA = line[2]
}
else {
//console.log("NO LINES")
}
markBlobs(image)
//image.write("./droneControlOutput/img_" + count + ".png")
// markerX = marker[0]
// markerY = marker[1]
// markerR = marker[2]
})
}
function thresholdImage(image) { //Color thresholding (looking at image to figure things out, color-wise)
for (var y = 0; y < image.bitmap.height - skipSize; y += skipSize) {
for (var x = 0; x < image.bitmap.width - skipSize; x += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(x,y))
if (color.r / color.b > (color1[0]/color1[2]) - 1.5 && color.r / color.b < (color1[0]/color1[2]) + 2.5 && color.r / color.g > (color1[0]/color1[1]) - 1 && color.r / color.g < (color1[0]/color1[1]) + 2.5) { //~ORANGE
image.setPixelColor(jimp.rgbaToInt(255,255,255,255),x,y)
}
/*else if (color.r / color.b > (color2[0]/color2[2]) - 0.5 && color.r / color.b < (color2[0]/color2[2]) + 0.5 && color.r / color.g > (color2[0]/color2[1]) - 0.5 && color.r / color.g < (color2[0]/color2[1]) + 0.5) { //GREEN
image.setPixelColor(jimp.rgbaToInt(100,100,100,255),x,y)
}*/
else {
image.setPixelColor(jimp.rgbaToInt(0,0,0,255),x,y)
}
}
}
return image
}
function findBlobsNoRecursion(image) { //Find groups of pixels of the same color (grouping colors)
blobsFound.blobs = [] //clear blobs from previous image
var pixNums = [0,0] //just to keep track of how many pixels were kept vs. how many were not after thresholding
for (var startY = 0; startY < image.bitmap.height - skipSize; startY += skipSize) { //Loop through all pixels (accounting for skipSize) in the image
for (var startX = 0; startX < image.bitmap.width - skipSize; startX += skipSize) {
var color = jimp.intToRGBA(image.getPixelColor(startX,startY)) //Get color of current pixel (startX,startY)
var inBlob = false
if (color.b > 0) { //**COMMENT NOT FOR MR LIN** type1 = 255, type2 = 100
pixNums[0]++
for (var i=0; i<blobsFound.blobs.length; i++) { //Loop through all blobs found so far to check if current pixel has already been used
for (var j=0; j<blobsFound.blobs[i].links.length && inBlob == false; j++) {
if (blobsFound.blobs[i].links[j].x == startX && blobsFound.blobs[i].links[j].y == startY) {
inBlob = true
}
}
}
}
else {
pixNums[1]++
}
if (inBlob == false && color.b > 0) { //If pixel is within threshold and not already used, then create a new blob
var edges = [] //A selection of links that will be used to find blob radii outside of findBlobsNoRecursion()
var links = [] //Points that will make up the new blob
var news = [] //Points that haven't been checked yet for new neighboring white pixels
news.push(new Link(startX,startY)) //Add first pixel to news
var iteration=0 //Just for me to see how long it takes for the program to finish the blob
while (news.length > 0) { //While there are still pixels whose neighbors are not checked...
var len = news.length //Number of pixels which, as of now, aren't checked
for (var i = len-1; i > -1; i--) { //Loop through current news[] pixels from last to first (won't include pixels added to the array later in the process)
var x = news[i].x //store location of new pixel to be checked
var y = news[i].y
if (y-skipSize > 0 && y+skipSize < image.bitmap.height && x-skipSize > 0 && x+skipSize < image.bitmap.width) { //make sure new pixel is not at the edge of the image
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize)) //START: check neighbor above
if (color.b == 255) { //if neighbor is white
var used = false
for (var j=0; j<news.length && used == false; j++) { //loop through new pixels
if (news[j].x == x && news[j].y == y-skipSize) { //check if neighbor is already added
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) { //loop through saved pixels (already in blob)
if (links[j].x == x && links[j].y == y-skipSize) { //check if neighbor is already used
used = true
}
}
if (used == false) {
news.push(new Link(x,y-skipSize)) //add neighbor to news[]
}
}
} //END: check neighbor above
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize)) //START: check neighbor below
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x && news[j].y == y+skipSize) {
used = true
}
if (used) {
break
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x && links[j].y == y+skipSize) {
used = true
}
}
if (used == false) {
news.push(new Link(x,y+skipSize))
}
}
} //END: check neighbor below
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y)) //START: check neighbor left
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x-skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x-skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x-skipSize,y))
}
}
} //END: check neighbor left
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y)) //START: check neighbor right
if (color.b == 255) {
var used = false
for (var j=0; j<news.length && used == false; j++) {
if (news[j].x == x+skipSize && news[j].y == y) {
used = true
}
}
if (used == false) {
for (var j=0; j<links.length && used == false; j++) {
if (links[j].x == x+skipSize && links[j].y == y) {
used = true
}
}
if (used == false) {
news.push(new Link(x+skipSize,y))
}
}
} //END: check neighbor right
}
if (isEdge(image,x,y,1)) { //check if new pixel is an edge
edges.push(new Link(x,y)) //add new pixel to edges[] (for calculating blob's radii later)
}
links.push(news[i]) //add this pixel to the new blob
news.splice(i,1) //remove this pixel from news[], as it's now checked
}
iteration++
}
if (links.length > 5) { //only add blob if it's size is somewhat significant
//console.log("...BLOB ADDED @ " + startX + "," + startY) //print blob's initial point
blobsFound.addBlob(1) //add an empty blob (constructor is not currently important)
blobsFound.blobs[blobsFound.blobs.length-1].links = links //fill blob's links[] array
blobsFound.blobs[blobsFound.blobs.length-1].edges = edges //fill blob's edges[] array
}
else {
//console.log("BLOB TOO SMALL")
}
}
}
}
//console.log("+: " + pixNums[0] + ", -: " + pixNums[1]) //not important
}
function isEdge(image, x, y, type) { //Edges used for finding the radii of a blob
var neighbors = 0
var color
if (x+skipSize < image.bitmap.width && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x+skipSize < image.bitmap.width && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x+skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0 && y+skipSize < image.bitmap.height) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y+skipSize))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y))
if (color.b == 255) {
neighbors++
}
}
if (x-skipSize > 0 && y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x-skipSize,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (y-skipSize > 0) {
color = jimp.intToRGBA(image.getPixelColor(x,y-skipSize))
if (color.b == 255) {
neighbors++
}
}
if (neighbors > 1 && neighbors < 7) {
return true
}
else {
return false
}
}
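The edge test above counts lit neighbors at skipSize spacing and calls a pixel an edge when it has some, but not all, of its eight neighbors lit (between 2 and 6). A minimal sketch of the same idea over a plain 2D boolean grid, with no jimp dependency (the grid representation and helper name are illustrative, not part of the program above):

```javascript
// Sketch: neighbor-count edge test over a plain 2D grid of booleans.
// A cell counts as an "edge" if it has some, but not all, lit neighbors.
function isEdgeGrid(grid, x, y, skip) {
  var offsets = [[-skip,-skip],[0,-skip],[skip,-skip],
                 [-skip,0],              [skip,0],
                 [-skip,skip],[0,skip],[skip,skip]]
  var neighbors = 0
  for (var i = 0; i < offsets.length; i++) {
    var nx = x + offsets[i][0]
    var ny = y + offsets[i][1]
    if (ny >= 0 && ny < grid.length && nx >= 0 && nx < grid[ny].length && grid[ny][nx]) {
      neighbors++
    }
  }
  return neighbors > 1 && neighbors < 7 //same 2..6 window as isEdge()
}
```

An interior pixel of a solid patch has all 8 neighbors lit and is rejected, while a corner pixel has only 3 and passes, which is what lets the radii calculation later use only boundary pixels.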
function markBlobs(image) { //Show where the program found blobs
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].links.length > 5) {
var location = [blobsFound.blobs[i].aspects[0],blobsFound.blobs[i].aspects[1]]
image.setPixelColor(jimp.rgbaToInt(0,100,255,255),math.round(location[0]),math.round(location[1]))
for (var j=0; j<blobsFound.blobs[i].edges.length; j++) {
location = [blobsFound.blobs[i].edges[j].x,blobsFound.blobs[i].edges[j].y]
image.setPixelColor(jimp.rgbaToInt(0,255,0,255),location[0],location[1])
}
}
}
}
function analyzeBlobs() { //Calculate data of a blob
for (var i=0; i<blobsFound.blobs.length; i++) {
blobsFound.blobs[i].calculateCenterRadii()
if (blobsFound.blobs[i].aspects[7] == 1) {
blobsFound.blobs[i].calculateLinearityDirection()
}
else if (blobsFound.blobs[i].aspects[7] == 2) {
blobsFound.blobs[i].calculateCircularity()
}
}
}
function findLines() { //Use blob data to find most likely path
var Lnum = 0
var bestLine = [0, 0] //[linearity, blob#]
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 1 && blobsFound.blobs[i].links.length > 10) {
if (blobsFound.blobs[i].aspects[5] > bestLine[0]) {
bestLine[0] = blobsFound.blobs[i].aspects[5]
bestLine[1] = i
}
Lnum++
}
}
if (blobsFound.blobs.length > 0 && Lnum > 0) {
var lineHeading = blobsFound.blobs[bestLine[1]].aspects[6]
var angleDifference = math.abs((math.pi*1.5) - lineHeading)
if (angleDifference > math.pi) {
angleDifference = (2*math.pi) - angleDifference
}
if (angleDifference > 0.5*math.pi) {
lineHeading += math.pi
}
if (lineHeading > 2*math.pi) {
lineHeading -= 2*math.pi
}
var lineData = [blobsFound.blobs[bestLine[1]].aspects[0],blobsFound.blobs[bestLine[1]].aspects[1],lineHeading]
}
else {
var lineData = [-1,-1,-1]
}
return lineData
}
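The heading adjustment inside findLines resolves the half-turn ambiguity of a line's axis: it folds the angle so the reported direction lands within a quarter turn of 1.5π (forward, in this angle convention). A standalone sketch of that normalization with the same branches, using plain Math instead of math.js:

```javascript
// Fold a line axis angle so it points within 90 degrees of the
// forward direction (1.5*PI here); a line's axis is ambiguous
// by half a turn, so we pick the end facing forward.
function foldHeading(heading) {
  var diff = Math.abs(1.5 * Math.PI - heading)
  if (diff > Math.PI) {
    diff = 2 * Math.PI - diff //take the shorter way around the circle
  }
  if (diff > 0.5 * Math.PI) {
    heading += Math.PI //flip to the other end of the axis
  }
  if (heading > 2 * Math.PI) {
    heading -= 2 * Math.PI //wrap back into [0, 2*PI)
  }
  return heading
}
```

For example, an axis reported at 0.5π (pointing backward) folds to 1.5π, while 1.5π itself is left unchanged.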
function findJunctions() { //Use blob data to find most likely junction
var Jnum = 0
var bestCircularity = [20, 0] //[circularity, blob#]
var bestDensity = [0, 0] //[density, blob#]
var bestBlob = 0
for (var i=0; i<blobsFound.blobs.length; i++) {
if (blobsFound.blobs[i].aspects[7] == 2 && blobsFound.blobs[i].links.length > 20) {
Jnum++
var circularity = blobsFound.blobs[i].aspects[3]
if (circularity < bestCircularity[0]) {
bestCircularity[0] = circularity
bestCircularity[1] = i
bestBlob = i
}
var density = blobsFound.blobs[i].aspects[4] //Not used right now...
}
}
if (blobsFound.blobs.length > 0 && Jnum > 0) {
var junctionData = [blobsFound.blobs[bestBlob].aspects[0],blobsFound.blobs[bestBlob].aspects[1],blobsFound.blobs[bestBlob].aspects[2]]
}
else {
var junctionData = [-1,-1,-1]
}
return junctionData
}
function BlobLibrary() {
this.blobs = []
}
BlobLibrary.prototype.addBlob = function(color) {
this.blobs = this.blobs.concat(new Blob(color))
}
function Blob(color) {
this.links = []
this.edges = []
this.radii = []
this.aspects = new Array(8)
this.aspects[0] = 320 //X
this.aspects[1] = 200 //Y
this.aspects[2] = 50 //Radius
this.aspects[3] = 3 //Circularity
this.aspects[4] = 5 //Density
this.aspects[5] = 0 //Linearity
this.aspects[6] = 0 //Angle
this.aspects[7] = color //Color (1=line, 2=junction)
}
Blob.prototype.addLink = function(x, y) {
this.links = this.links.concat(new Link(x, y))
}
Blob.prototype.addEdge = function(x, y) {
this.edges = this.edges.concat(new Link(x, y))
}
Blob.prototype.calculateCenterRadii = function() {
var X = 0
var Y = 0
var edgeRadii = new Array(this.edges.length)
for (var i=0; i<this.links.length; i++) {
X += this.links[i].x
Y += this.links[i].y
}
X /= this.links.length
Y /= this.links.length
this.aspects[0] = X
this.aspects[1] = Y
for (var i=0; i<this.edges.length; i++) {
var edgeRadius = math.sqrt(math.pow(this.edges[i].x - this.aspects[0],2) + math.pow(this.edges[i].y - this.aspects[1],2))
edgeRadii[i] = edgeRadius
}
this.radii = edgeRadii
if (this.radii.length > 0) {
var avgRadius = 0
for (var i=0; i<this.radii.length; i++) {
avgRadius += this.radii[i]
}
avgRadius /= this.radii.length
this.aspects[2] = avgRadius
}
}
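calculateCenterRadii averages the blob's member pixels to get a centroid, then averages each edge pixel's distance from that centroid to get a mean radius. The same computation on hand-picked points, using plain Math instead of math.js (the point lists and function name are illustrative):

```javascript
// Centroid of member points, then mean distance of edge points
// from that centroid, as in Blob.calculateCenterRadii.
function centerAndRadius(links, edges) {
  var cx = 0, cy = 0
  for (var i = 0; i < links.length; i++) {
    cx += links[i].x
    cy += links[i].y
  }
  cx /= links.length //centroid is the mean of all member coordinates
  cy /= links.length
  var avg = 0
  for (var j = 0; j < edges.length; j++) {
    avg += Math.sqrt(Math.pow(edges[j].x - cx, 2) + Math.pow(edges[j].y - cy, 2))
  }
  return { x: cx, y: cy, radius: avg / edges.length }
}
```

With the four corners of a 2x2 square as both links and edges, the centroid is (1,1) and every edge sits at distance √2, so the mean radius is √2.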
Blob.prototype.calculateCircularity = function() {
if (this.radii.length > 0) {
var avgDifference = 0
for (var i=0; i<this.radii.length; i++) {
avgDifference += (this.radii[i] - this.aspects[2])
}
avgDifference /= this.radii.length
this.aspects[3] = avgDifference
}
this.aspects[4] = this.links.length / this.aspects[2]
}
Blob.prototype.calculateLinearityDirection = function() {
var shortest = 700 //start above any plausible radius so the first edge replaces it
var longest = 0
var arrow = [1,1]
for (var i=0; i<this.radii.length; i++) {
var edgeRadius = this.radii[i]
if (edgeRadius < shortest) {
shortest = edgeRadius
}
if (edgeRadius > longest) {
longest = edgeRadius
arrow[0] = this.edges[i].x - this.aspects[0]
arrow[1] = this.edges[i].y - this.aspects[1]
}
}
var linearity = longest - shortest
this.aspects[5] = linearity
var angle = math.atan2(math.abs(arrow[1]), math.abs(arrow[0]))
if (arrow[0] < 0 && arrow[1] > 0) {
angle = math.pi - angle
}
else if (arrow[0] < 0 && arrow[1] < 0) {
angle = math.pi + angle
}
else if (arrow[0] > 0 && arrow[1] < 0) {
angle = (2*math.pi) - angle
}
this.aspects[6] = angle
}
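The quadrant fix-up at the end of calculateLinearityDirection recovers a full 0..2π angle from atan2 of the absolute components. A standalone sketch of that mapping with the same branches (plain Math; the function name is illustrative):

```javascript
// Recover a full-circle angle from a direction vector using the
// same abs-then-adjust branches as calculateLinearityDirection.
function fullCircleAngle(dx, dy) {
  var angle = Math.atan2(Math.abs(dy), Math.abs(dx)) //first-quadrant angle
  if (dx < 0 && dy > 0) {
    angle = Math.PI - angle //second quadrant
  } else if (dx < 0 && dy < 0) {
    angle = Math.PI + angle //third quadrant
  } else if (dx > 0 && dy < 0) {
    angle = 2 * Math.PI - angle //fourth quadrant
  }
  return angle
}
```

So (1,1) maps to π/4, (-1,-1) to 5π/4, and (1,-1) to 7π/4, which is the standard counter-clockwise angle wrapped into [0, 2π).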
function Link(x, y) {
this.x = x
this.y = y
}