User:Mschmick
Revision as of 11:47, 9 February 2018
Hi, Mark here. I'm building a robot for drones: http://www.flyingchair.co (prototype stage).
Here's a summary of the 5MoF talk I'm preparing for Thu 2/15. If you have any suggestions or concerns, please let me know.
I'm building a robot called Birdhouse. It's a smart launchpad for consumer drones like the Phantom or Inspire, letting you remotely shelter, charge, and fly. It looks like a breadbox (shelter / charge) and opens into a flight deck (launch / fly / land). In this talk I'll discuss the software stack I'm running, since this is my first robot and I'm probably doing it wrong. Anyway, this is one non-ROS approach and I'm open to suggestions!
I call Birdhouse a "robot" since it does run an autonomous event loop (no human UI):
- gather data from sensors and telemetry
- apply realtime logic
- run actuators, etc to change state
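That gather/logic/actuate loop might be sketched like this in Python. The sensor names, the close-lid rule, and the actuator stub are all placeholders for illustration, not Birdhouse's actual code:

```python
import time

SENSOR_NAMES = ("light", "weather", "audio")  # hypothetical sensor set

def read_sensors():
    # Placeholder: real values would arrive over ZeroMQ subscriptions
    return {name: 0.0 for name in SENSOR_NAMES}

def apply_logic(state):
    # Placeholder realtime logic: e.g. close the lid when it gets dark
    return {"close_lid": state.get("light", 0.0) < 0.1}

def run_actuators(commands):
    # Placeholder: drive motors / lights based on the command dict
    pass

def event_loop(rate_hz=5, iterations=None):
    """Gather -> logic -> actuate, at a fixed rate (no human UI)."""
    period = 1.0 / rate_hz
    n = 0
    while iterations is None or n < iterations:
        state = read_sensors()
        commands = apply_logic(state)
        run_actuators(commands)
        n += 1
        time.sleep(period)
```

The `iterations` argument is just there so the loop can be exercised in tests; on the robot it would run forever.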
To minimize pilot workload, this loop runs whenever the props are spinning, i.e. while the pilot is piloting. That's the worst-case load, and consequently a good baseline for system requirements. Consider an outbound drone breaching its departure geofence, signaling Birdhouse to close; at that moment the system is concurrently handling:
- stream LOS (line-of-sight) video (RasPi cam, mplayer)
- ingest drone telemetry @50 Hz
- downsample to Birdhouse @5 Hz (REST updates)
- read sensors: light, weather, audio, video (ZeroMQ pub/sub)
- logic & logging (e.g. waypoint math for LOS aiming and geofence detection)
- re-aim LOS camera at drone
- run "close" sequence (5X motors, lights)
- serve file from network share (smb)
- serve web and REST requests (http)
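The geofence-detection step of that scenario is just waypoint math on the incoming telemetry. A minimal sketch, assuming the fence is a simple radius around the pad (the radius and coordinates here are made up for illustration):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def breached_departure_fence(pad, drone, radius_m=100.0):
    """True once the drone is farther than radius_m from the pad.

    pad and drone are (lat, lon) tuples from telemetry.
    """
    return haversine_m(pad[0], pad[1], drone[0], drone[1]) > radius_m
```

The same distance/bearing math feeds the LOS-camera aiming step; in the real loop this check would run against the 5 Hz downsampled telemetry.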
The software architecture I'm proposing draws from a variety of OSS projects. It provides system properties useful for general robotics, like low-to-zero latency and high concurrency. On top of that, I'm implementing some abstractions particularly useful for dev-stage robotics:
- HAL (hardware abstraction library)
- Virtual clock
- RESTful hardware singleton