Latest revision as of 22:27, 12 February 2018
Hi, Mark here. I'm building a robot for drones: http://www.flyingchair.co (prototype stage). Email me at mark@flyingchair.co.
Here's a summary of the RESTful Robots 5MoF talk I'm giving on Thursday 2/15. If you have any suggestions or concerns, please let me know.
I'm building a robot called Birdhouse, a remote telepresence launchpad for consumer drones like the Phantom. It looks like a breadbox (shelter / charge) and opens to a flight deck (launch / fly / land). In this talk I'll discuss the software stack I'm running; since this is my first robot, I'm probably doing it wrong. Anyway, here's one non-ROS approach, and I'm open to suggestions!
While it doesn't walk or fly, I call Birdhouse a "robot" since it does run an autonomous event loop (no human UI):
- gather data from sensors and telemetry
- apply realtime logic
- run actuators, etc. to change state
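The three steps above can be sketched as a simple sense/think/act loop. This is only an illustrative outline, not Birdhouse's actual code — the function names, sample data, and geofence math are all placeholders:

```python
import time

TICK_HZ = 5  # assumed loop rate for this sketch; the real rate may differ


def read_sensors():
    # placeholder: would poll light/weather/audio sensors and drone telemetry
    return {"drone_pos": (10.0, 4.0)}


def apply_logic(state):
    # placeholder: waypoint math, geofence checks, logging
    x, y = state["drone_pos"]
    state["outside_geofence"] = (x * x + y * y) ** 0.5 > 100.0  # 100 m radius
    return state


def run_actuators(state):
    # placeholder: would drive motors, lights, camera gimbal
    if state["outside_geofence"]:
        print("closing flight deck")


def event_loop(ticks):
    # gather -> logic -> actuate, once per tick
    for _ in range(ticks):
        state = read_sensors()
        state = apply_logic(state)
        run_actuators(state)
        time.sleep(1.0 / TICK_HZ)
```

A real version would replace the fixed `sleep` with a deadline scheduler so logic time doesn't eat into the tick budget.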
To minimize pilot loading, this loop runs whenever props are spinning, i.e. while the pilot is piloting. That's the worst-case load scenario, and consequently the best case for deriving system requirements. Consider an outbound drone breaching a departure geofence, signaling Birdhouse to close:
- stream LOS (line-of-sight) video (RasPi cam, mplayer)
- ingest drone telemetry @50 Hz
- downsample to Birdhouse @5 Hz (REST updates)
- read sensors: light, weather, audio, video (ZeroMQ pub/sub)
- logic & logging (e.g. waypoint math for LOS aiming and geofence detection)
- re-aim LOS camera at drone
- run "close" sequence (5X motors, lights)
- serve file from network share (smb)
- serve web and REST requests (http)
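One simple way to get from 50 Hz telemetry ingest down to 5 Hz REST updates is decimation: forward every 10th sample. A minimal sketch — the `Downsampler` class is invented for illustration, not part of the actual stack:

```python
class Downsampler:
    """Forward every Nth sample (e.g. 50 Hz -> 5 Hz with factor=10)."""

    def __init__(self, factor=10):
        self.factor = factor
        self.count = 0

    def push(self, sample):
        """Return the sample when it should be forwarded, else None."""
        self.count += 1
        if self.count % self.factor == 1:  # forward the 1st, 11th, 21st, ...
            return sample
        return None


# 50 telemetry samples in, 5 REST updates out
ds = Downsampler(factor=10)
forwarded = [s for s in range(50) if ds.push(s) is not None]
```

Decimation keeps only instantaneous values; if the REST consumers need smoothed data (e.g. averaged position), a windowed filter over each group of 10 would replace the modulo check.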
The software architecture I'm proposing draws from a number of OSS projects. It provides system services for general robotics like low-to-zero latency and high concurrency. On top of that I'm implementing some abstractions for dev-stage robotics:
- HAL (hardware abstraction library)
- Virtual clock
- RESTful hardware singleton
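These abstractions are mine and still in flux, but roughly: the HAL hides whether code is driving real hardware or fakes, the virtual clock lets the loop run in simulated time during bench tests, and the singleton ensures concurrent REST requests share one owner of the actuators. A hypothetical sketch — every class and method name here is invented for illustration:

```python
class VirtualClock:
    """Simulated time source so tests run faster than wall time."""

    def __init__(self):
        self.t = 0.0

    def now(self):
        return self.t

    def sleep(self, seconds):
        self.t += seconds  # advance instantly instead of blocking


class FakeMotor:
    """HAL stand-in for a real motor driver (e.g. GPIO/PWM on the RasPi)."""

    def __init__(self, name):
        self.name = name
        self.position = 0.0

    def step(self, delta):
        self.position += delta


class Birdhouse:
    """RESTful hardware singleton: one object owns all actuators, so
    concurrent REST requests can't double-drive the hardware."""

    _instance = None

    def __new__(cls, clock=None):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.clock = clock or VirtualClock()
            cls._instance.motors = [FakeMotor(f"m{i}") for i in range(5)]
        return cls._instance

    def close(self):
        # "close" sequence: drive all 5 motors, pacing on the clock
        for motor in self.motors:
            motor.step(-1.0)
            self.clock.sleep(0.5)
```

Swapping `VirtualClock` for a wall-clock wrapper and `FakeMotor` for real drivers is the only change needed to go from bench test to hardware, which is the point of the HAL.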
References: Python, Django (https://www.djangoproject.com), Django REST Framework / DRF (http://www.django-rest-framework.org), Django Q (http://django-q.readthedocs.io) / Channels (https://channels.readthedocs.io), ZeroMQ (http://zeromq.org), NGINX (https://www.nginx.com), uWSGI (http://uwsgi-docs.readthedocs.io)