In our modern world, there is one mystery in every young parent's life. When your kid is happy and quiet, you see pacifiers everywhere and there's always one in your pocket or beside you, but...
...when your child starts yelling at you and its screams are louder than a T-Rex whining, you will most likely NEVER have one near you, and all of a sudden they have somehow all disappeared.
I am not kidding - every parent can confirm this strange and sometimes mysterious disappearance of your baby's beloved external thumb. Some day there will be a Netflix Original documentary about this phenomenon - you'll see.
Marvin and I - one vision for our pacifier (ro)bot
... so Marvin (my 21-month-old son) decided that his dad (Adam, that's me) should build a robot that can collect and store all his pacifiers. Long story short:
Let's build Elvis, your personal pacifier bot, controlled by voice via an Alexa Echo Dot. You will never have to search for your lost pacifiers again!
Part 1 - Prototyping Elvis
After the first idea, we had to find a solution to arrange and store some pacifiers - the core function of the pacifier bot.
It should be something rotatable that makes it easy to release a single pacifier. As it's not practical to use too many motor-driven parts, I decided to find a mechanical solution with a rotating and a releasing part. The LEGO Power Miners 'Kristallsammler' (Crystal Sweeper, set no. 8961) uses this orange wheel with 6 crawlers.
This is the base for the storing wheel:
So we are ready to spin the wheel, and we also need a solid base for this mechanism - one with enough space for the pacifiers to pass by. All other parts were collected from various LEGO Technic and LEGO Mindstorms EV3 sets.
You should start with a minimal amount of Python code, e.g. driving a MediumMotor on OUTPUT_A:
from ev3dev2.motor import MediumMotor, OUTPUT_A, SpeedPercent
To find the right rotating speed you can use the SpeedPercent class like this:
GADGET_SPEED = 35
rotatenumber = 1
gadget_rotate = MediumMotor(OUTPUT_A)
Experiment with different values, as you might use differently sized gears and ratios. In my evaluation, a speed of 35% was just right to not lose the pacifiers during rotation. The number of rotations also depends on your gear ratio. I used 1/3, so all rotation numbers must be multiplied or divided accordingly to find the correct rotation position.
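The position math can be sketched off-brick in plain Python. The 6-slot wheel and the 3:1 reduction below are assumptions based on my build; adjust them to your own gearing:

```python
# Plain-Python sketch of the rotation math - no EV3 hardware required.
# Assumptions (adjust to your build): the crawler wheel has 6 pacifier
# slots, and the motor turns 3 times per wheel turn (the 1/3 ratio).
SLOTS = 6
MOTOR_TURNS_PER_WHEEL_TURN = 3

def motor_rotations(slots_to_advance):
    """Motor rotations needed to advance the wheel by n slots."""
    return slots_to_advance * MOTOR_TURNS_PER_WHEEL_TURN / SLOTS

print(motor_rotations(1))  # 0.5 - half a motor turn per slot
print(motor_rotations(6))  # 3.0 - one full wheel revolution
```

On the brick, such a value can then be fed into a call like `gadget_rotate.on_for_rotations(SpeedPercent(GADGET_SPEED), motor_rotations(1))`.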
At the end of the first stage you should have a solid rotation base that can be triggered by a single standalone Python 3 script - via the Visual Studio Code EV3 plugin, SSH, or manual startup via Brickman.
Up to this point you don't need any Bluetooth or Alexa Skill functionality. You can use the standalone Python script (version 0.2) to try out everything without the complexity of the AlexaGadget part. That part comes now. Before you start, make sure to experiment with missions 1-4 from the Alexa Voice Challenge tutorials - good luck!
Part 2 - Coding Elvis functionality
Now comes the moderately hard part: the Python coding. You can see the real-time coding in this video:
(but really... better don't watch it -
it's epically long, has no sound, and shows my limited knowledge of Python structures)
To keep it simple and short, here are the main 12 parts as you can find them in the attached .py file(s).
We will start by collecting all functions to build the required interactions:
# LED Status
def elvisStatus(self, tocolor):
    print("Elvis LED Status Change")
This will create some LED colors on the bot to show the program status:
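As a minimal sketch of what such a status function can map, here is a hypothetical state-to-color table. The state names are made up for illustration; the color strings are ones that ev3dev2's `Leds.set_color()` accepts:

```python
# Hypothetical status-to-LED-color mapping - the state names are my own
# invention; the color names match ev3dev2.led.Leds.set_color().
STATUS_COLORS = {
    "idle": "GREEN",
    "calibrating": "AMBER",
    "spitting": "ORANGE",
    "error": "RED",
}

def status_color(status):
    """Pick the LED color for a bot status, defaulting to LEDs off."""
    return STATUS_COLORS.get(status, "BLACK")

print(status_color("calibrating"))  # AMBER
```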
# COLORSENSOR Feedback
# (not used at this point)
print("Elvis ColorSensor Status Started")
# return self.gadget_sensor.value()
ColorSensor values would be returned via this function, but it is not used at this point.
# MOTOR Feedback
# PARAMS: rotatenumber - int for the number of EV3 motor rotations
def elvisSchnullerRotate(self, rotatenumber):
    print("Elvis Motor Started")
This rotates the pacifier wheel by the given number of rotations.
# Calibrates colors - finds the ~white~ (COLOR_WHITE = 6) brick on the wheel, stops and rotates 1/3 to the next position
def elvisCalibrate(self, calibratecolor):
    self.gadget_rotate.on( SpeedPercent(GADGET_SPEED/3) )
    while self.gadget_sensor.value() != calibratecolor:
        pass
    self.gadget_rotate.off()
    # TODO calibrated position savepoint: calibrated_position = self.gadget_rotate.position
    # TODO custom event to Alexa when calibration completed
Calibration function to find the colored markers.
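The polling loop can be simulated off-brick. The sketch below stands in for successive `ColorSensor.value()` calls with a list of readings (6 is the white color code in ev3dev2's COL-COLOR mode):

```python
COLOR_WHITE = 6  # ev3dev2 ColorSensor code for white in COL-COLOR mode

def find_marker(readings, calibratecolor=COLOR_WHITE):
    """Simulated calibration loop: poll 'sensor' readings until the
    marker color shows up; return how many polls it took, or -1."""
    for polls, value in enumerate(readings, start=1):
        if value == calibratecolor:
            return polls  # on the brick: stop the motor here
    return -1  # marker never seen during a full turn

print(find_marker([0, 2, 2, 4, 6, 1]))  # 5
```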
# Reveals pacifiers - activates Elvis' tongue to spit out the latest pacifier
self.gadget_spit.on_for_rotations( SpeedPercent(GADGET_SPEED), 3 )
This activates the release mechanism of the bot.
# TouchSensor State
pressed = bool( self.gadget_tongue.is_pressed )
This checks the status of the release mechanism: whether the TouchSensor is pressed or not.
# Calibrates tongue
if not self.elvisTongue():
    while not self.elvisTongue():
        pass  # in the full script the tongue motor moves here
This calibrates the release mechanism to make sure it doesn't collide with the rotating wheel.
And finally the __init__ routine for the Alexa gadget, described in Part 4.
In short:
# Now the tricky part: connecting to the Amazon Echo Dot as a Bluetooth gadget
self.gadget_leds = Leds()
self.gadget_sensor = ColorSensor()
self.gadget_sensor.mode = 'COL-COLOR'
self.gadget_rotate = MediumMotor(OUTPUT_A)
self.gadget_spit = LargeMotor(OUTPUT_B)
self.gadget_tongue = TouchSensor()
self.gadget_tongue.mode = 'TOUCH'
Connecting to the Echo Dot happens in the __init__(self) method described in the mission tutorials.
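Besides the __init__ code, an AlexaGadget needs a configuration file next to the script (same base name as the .py file). The IDs below are placeholders you receive when registering the gadget in the Alexa Voice Service developer console; the capability entry is the one used throughout the mission tutorials:

```ini
[GadgetSettings]
amazonId = YOUR_GADGET_AMAZON_ID
alexaGadgetSecret = YOUR_GADGET_SECRET

[GadgetCapabilities]
Custom.Mindstorms.Gadget = 1.0
```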
if __name__ == "__main__":
    elvisgadget = ElvisSchnullerbot()
    # --> THIS main() does the magic
    elvisgadget.main()
The __main__ block initializes the AlexaGadget class, here named ElvisSchnullerbot, runs a startup routine and calls the .main() method, which starts the connection to the configured Echo Dot.
Find the full source (version 1.0.1) in the attached files.
Part 3 - Conversational design
The conversational design started with a paper prototype and some real dialogues with family and friends to find the right conversation flows:
This approach saves you some headaches during skill implementation (at least if you are designing your first Alexa Skill). The handwriting above assumes a very smart gadget functionality that would require more development on both the software and hardware backend. To keep it approachable, we followed the MVP concept, which is implemented in version 1.0.
A good idea is to cluster the things you say:
ON THE LEFT < what you say
IN THE MIDDLE < what Elvis would say
ON THE RIGHT < what the EV3 Python should do
This helped me a lot!
Follow some Alexa Skill tutorials and find your flow. Mine was:
What Elvis says at startup:
.reprompt(`<voice name="Hans">Ich kann Schnuller aufbewahren oder den Schnuller deiner Lieblingsfarbe ausspucken. Was möchtest du?</voice>`)
("I can store pacifiers or spit out the pacifier of your favorite color. What would you like?")
What Elvis says during calibration:
.speak(`<voice name="Hans">Es macht Sinn das Schnullerrad gelegentlich zu kalibrieren. Auf geht's!</voice>`)
("It makes sense to calibrate the pacifier wheel occasionally. Let's go!")
What Elvis says during pacifier release:
.speak(`<voice name="Hans">Schnuller wird ausgespuckt.</voice>`)
("Pacifier is being spat out.")
All other interactive conversations will follow soon. And by the way: the Elvis pacifier skill uses the male Amazon Polly voice called "Hans". This gives it some groove ;-)
Part 4 - Alexa skill design
The implementation of the skill was mostly focused on the interaction part with the robot, so it is still a work in progress. You can get a glimpse of the full JS code, which is based on mission 3 of the tutorial.
You can use quite a lot of the gadget's functionality by passing the values directly into the directives, e.g.:
The JS directive sent from the skill:
let directive = Util.build(endpointId, NAMESPACE, NAME_CONTROL,
    { type: 'calibrate', farbcode: 2 }); // payload keys as read on the Python side
The Python interpretation on the gadget:
payload = json.loads(directive.payload.decode("utf-8"))
control_type = payload["type"]
if control_type == "calibrate":
    if payload["farbcode"] == 2:
        ...  # handle the first marker color
    elif payload["farbcode"] == 4:
        ...  # handle the second marker color
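Off the brick, the decoding part of this handler can be tested on its own. The sketch below keeps the payload keys from the snippet above; the returned action tuple is my own convention for illustration, not the project's real API:

```python
import json

def handle_control(payload_bytes):
    """Decode a custom-directive payload and decide what Elvis should do.
    Keys ('type', 'farbcode') follow the snippet above; the returned
    tuple is just an illustrative convention."""
    payload = json.loads(payload_bytes.decode("utf-8"))
    if payload["type"] == "calibrate":
        return ("calibrate", payload.get("farbcode"))
    return ("ignore", None)

print(handle_control(b'{"type": "calibrate", "farbcode": 2}'))
# ('calibrate', 2)
```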
The standalone part in German is heavily beta and will not (yet) work with version 0.1 in the project. I used version 0.2, which can be found in the attached files.
Part 5 - User-testing and troubleshooting
Marvin did his best to challenge the LEGO Mindstorms construction. It's not yet bulletproof, but after some building optimization it can withstand a 21-month-old.
I had several issues with starting the scripts from the VS Code ev3dev plugin or directly from Brickman. Everything was correct, and I finally decided to upgrade the ev3dev system to the latest image, ev3dev-stretch-ev3-generic-2019-10-23. Before that I checked:
- correct shebang: #!/usr/bin/env python3
- correct line breaks: LF, not CRLF
- sequence of .py imports: from ev3dev2 ... first, then from agt
- ... I really did. Nothing helped - the new October image just worked fine(!)
The last SD image was ev3dev-stretch-ev3-generic-2019-03-03 :-/ Via SSH, sudo python3 ./script.py and the correct password it worked, but that's not a proper solution.
The motor speed for the MediumMotor is best set to a maximum of 35%. In the project I used:
GADGET_SPEED = 35
gadget_rotate.on( SpeedPercent(GADGET_SPEED) )
For calibration and positioning purposes, a fraction of /2 or even /3 of that speed worked best:
gadget_rotate.on( SpeedPercent(GADGET_SPEED/3) )
Bluetooth is not always my friend! Thanks to the unpair bash script in the contest's ZIP package, you can unpair your Amazon Echo from the EV3! Unpairing the other end of the Bluetooth connection in the Alexa app is mentioned multiple times in the mission tutorials, but it took me hours to find the correct place to unpair the saved Bluetooth devices in the app.
Here is a shortcut that will save you some time:
- Open the Alexa app.
- Select Devices.
- Select Echo & Alexa.
- Select your device. (!!! this is the part I missed !!!)
- Select Bluetooth Devices.
- Select the device you want to remove, and then select Forget Device. Repeat this step for each device you want to remove.
With everything in place, we can ask Alexa for Elvis' assistance:
'Alexa, frag' Elvis nach den Schnullern!' (Alexa, ask Elvis about the pacifiers!)
As one final tweak, we switch Alexa's female voice to a bulletproof Elvis-like quality: Hans. With the help of Amazon Polly voices (https://aws.amazon.com/de/polly/) and some SSML tags (which took quite a big amount of my lifetime to find the correct place in the JSON and JS files), the voice output of Elvis changed to a solid male character.
<voice name="Hans">#THINGS ELVIS HAS TO SAY</voice>
And even more Elvis-like with the prosody pitch switched to low:
`<voice name="Hans"><prosody pitch="low">#EVEN LOWER VOICE FOR THINGS ELVIS HAS TO SAY</prosody></voice>`
See the SSML reference for more tweaks. Hans is the only German-speaking guy in that list ;-)
Part 7 - Future ideas
One of my first ideas was to build a *searching* robot for pacifiers lying around the apartment. I started building image recognition and classification training sets for all of Marvin's pacifiers. After evaluating different cameras for the EV3 on ev3dev, I quickly discarded this part: even after recognizing a pacifier (which should be doable with some ML, externalized computation and a few more prototyping nights), there is still a long way to go:
- navigate Elvis to the pacifier
- position Elvis in the correct spot to the pacifier
- safely grab the pacifier
- (somehow) hand over the pacifier to a storage
- collect the pacifier in the storage
Thank you for reading my project story @LEGOMINDSTORMSVoiceChallenge
Please share your feedback if you liked this project. I would like to read your thoughts on it. Feel free to ask me questions in the comments section.
... to be continued :-)