User:Mschmick

From Noisebridge
Hi, Mark here. I'm building a robot for drones: http://www.flyingchair.co (prototype stage). Email me [mailto:mark@flyingchair.co here].


Here's a summary of the '''RESTful Robots''' 5MoF talk I'm giving on Thur 2/15. If you have any suggestions or concerns please let me know.


----


'''''I'm building a robot called Birdhouse,''''' a remote telepresence launchpad for consumer drones like the Phantom. It looks like a breadbox (shelter / charge), and opens to a flight deck (launch / fly / land). In this talk I'll discuss the software stack I'm running since this is my first robot and I'm probably doing it wrong. Anyway, here's '''''one''''' non-ROS approach and I'm open to suggestions!


While it doesn't walk or fly, I call Birdhouse a "robot" since it '''''does''''' run an autonomous event loop (no human UI):


# gather data from sensors and telemetry
# apply realtime logic
# run actuators, etc. to change state
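The three steps above can be sketched as a plain Python loop. This is a minimal illustration, not Birdhouse's actual code; every function name and sensor value here is a hypothetical stand-in:

```python
import time

def read_sensors():
    # Hypothetical stand-in for real sensor/telemetry reads.
    return {"light": 0.8, "drone_distance_m": 120.0}

def decide(state):
    # Realtime logic: close up if the drone breaches a 100 m geofence.
    return "close" if state["drone_distance_m"] > 100.0 else "idle"

def actuate(command):
    # Stand-in for the motor/light drivers.
    return f"actuators: {command}"

def run_loop(ticks, hz=5):
    log = []
    for _ in range(ticks):
        state = read_sensors()        # 1. gather data from sensors and telemetry
        command = decide(state)       # 2. apply realtime logic
        log.append(actuate(command))  # 3. run actuators to change state
        time.sleep(1.0 / hz)          # hold the loop rate (5 Hz by default)
    return log
```

The fixed `sleep` keeps the sketch simple; a real loop would subtract the time spent in each tick from the sleep interval.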


''To minimize pilot loading, this loop runs whenever props are spinning, '''i.e. the pilot is piloting.''''' It's a worst-case scenario and consequently a sound basis for system requirements. Consider the scenario of an outbound drone breaching a departure geofence, signaling Birdhouse to close:


* stream LOS (line-of-sight) video (RasPi cam, mplayer)
* ingest drone telemetry @50 Hz
* downsample to Birdhouse @5 Hz (REST updates)
* read sensors: light, weather, audio, video (ZeroMQ pub/sub)
* logic & logging (e.g. waypoint math for LOS aiming and geofence detection)
* re-aim LOS camera at drone
* run "close" sequence (5X motors, lights)
* serve files from a network share (SMB)
* serve web and REST requests (HTTP)
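To illustrate the "waypoint math" item, a geofence breach test can be a simple distance check. The coordinates and 100 m radius below are made up, and at launchpad scale an equirectangular approximation is plenty accurate:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: fine over a few hundred meters.
    r = 6371000.0  # Earth radius, meters
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return r * math.hypot(x, y)

def breaches_geofence(drone, home, radius_m=100.0):
    # drone/home are (lat, lon) pairs.
    return distance_m(*drone, *home) > radius_m

home = (37.7627, -122.4426)  # hypothetical Birdhouse location
near = (37.7629, -122.4426)  # roughly 22 m north: inside the fence
far  = (37.7727, -122.4426)  # roughly 1.1 km north: breach, close Birdhouse
```

The same distance/bearing math can feed the LOS camera aiming, since both reduce to the drone's offset from a fixed home point.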


The software architecture I'm proposing draws from a number of OSS projects. It provides system services for general robotics like low-to-zero latency and high concurrency. On top of that I'm implementing some abstractions for dev-stage robotics:


* HAL (hardware abstraction library)
* Virtual clock
* RESTful hardware singleton
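As one example of why these help at dev stage, a virtual clock (sketched below; not the actual Birdhouse API) lets the event loop run against simulated time, so tests of a "close" sequence don't wait on real seconds:

```python
import time

class VirtualClock:
    """Drop-in for time.time()/time.sleep() that can run in simulated time."""

    def __init__(self, simulated=True, start=0.0):
        self.simulated = simulated
        self._now = start

    def now(self):
        # Simulated mode returns the virtual timestamp; real mode defers to time.time().
        return self._now if self.simulated else time.time()

    def sleep(self, seconds):
        if self.simulated:
            self._now += seconds  # advance instantly during dev/test
        else:
            time.sleep(seconds)

clock = VirtualClock(simulated=True)
clock.sleep(3600)  # a simulated hour passes instantly
```

Code written against `clock.now()`/`clock.sleep()` instead of the `time` module runs identically on the bench and on the hardware, with only the constructor flag flipped.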
References: Python, [https://www.djangoproject.com Django], [http://www.django-rest-framework.org Django REST Framework] (DRF), [http://django-q.readthedocs.io/en/latest/ Django Q] / [https://channels.readthedocs.io/en/latest/ Channels], [http://zeromq.org ZeroMQ], [https://www.nginx.com/resources/wiki/ NGINX], [http://uwsgi-docs.readthedocs.io/en/latest/ uWSGI]

Latest revision as of 22:27, 12 February 2018
