My Seeed Studio ReSpeaker 4-Mic Array arrived during the 2020 COVID-19 lockdown, which pretty much guaranteed it some immediate attention.
The Seeed Studio wiki has good instructions, and it was smooth sailing until the section "Alexa/Baidu/Snowboy SDK". I have Google Home Mini smart speakers and so was loath to register for Alexa. Baidu seemed like a good alternative: in 2016 Baidu published a stunning paper, Deep Speech, on using deep learning for speech-to-text.
Getting Baidu authorization keys, however, proved way too slow, so I took a quick look at Mozilla's implementation of Deep Speech, built on Google's TensorFlow.
The purpose of all this (besides having some fun) is to see if I can voice-control my IoT devices without an Internet link. The added security and privacy also seem worthwhile. And it is not like I'm going anywhere for a few days.
At this point Mycroft looks tempting, and since the instructions are straightforward, I downloaded the image file. There is a typo in the command for writing the image to the SD card; just replace /dev/sdb1 with /dev/sdb:
sudo dd if=path-to-your-image.img of=/dev/sdb bs=20M
Per the instructions, you will have to register with their website, so keep a note of the registration code shown on the screen. Keep following along until the section "Selecting audio output and audio input".
The ReSpeaker is not listed in the microphone list, but Dimitry Maslov comes to the rescue:
sudo apt-get update
sudo apt-get upgrade
git clone https://github.com/respeaker/seeed-voicecard.git
cd /home/pi/seeed-voicecard
./install.sh 4mic
Next, go back to Mycroft: a quick and clean way is to reboot. Then reconfigure it with
mycroft-setup-wizard
and select 'Other'. Mycroft should now work. Here's a video of mine.
But what I really wanted was for Mycroft to function offline. There is some talk of a "Personal Server" version, and as this forum thread shows, there is a lot of code by the redoubtable JarbasAI, but it is not quite ready yet.
Mycroft seems to run a lot slower than Google Assistant. This is because it also uploads the audio to cloud servers, and Mycroft's servers probably have a lot less oomph.
Next we want Mycroft to turn on an IoT lamp. We could use any of several services for this, for example Adafruit, but for simplicity we use an esp8266 single-channel relay and a webhook. Run 'mycroft-msk create' and fill in the questionnaire:
(.venv) pi@picroft:~ $ mycroft-msk create
Enter a short unique skill name (ie. "siren alarm" or "pizza orderer"): soldering station lamp
Class name: SolderingStationLampSkill
Repo name: soldering-station-lamp-skill
Looks good? (Y/n) n
Enter a short unique skill name (ie. "siren alarm" or "pizza orderer"): soldering station lamp on
Class name: SolderingStationLampOnSkill
Repo name: soldering-station-lamp-on-skill
Looks good? (Y/n) y
Enter some example phrases to trigger your skill:
- Soldering station lamp on
- Soldering station light on
- Turn on the soldering station lamp
- Turn on the soldering station light
-
Enter what your skill should say to respond:
- The soldering station light is now on
- Turning on the soldering station light
-
Enter a one line description for your skill (ie. Orders fresh pizzas from the store):
- Turns on the light on the soldering station
Enter a long description:
> Turns on the light on the soldering station
>
Enter author: cmheong
Go to Font Awesome (fontawesome.com/cheatsheet) and choose an icon.
Enter the name of the icon: lightbulb
Pick a color for your icon. Find a color that matches the color scheme at mycroft.ai/colors, or pick a color at: color-hex.com.
Enter the color hex code (including the #): #fff68f
Categories define where the skill will display in the Marketplace. It must be one of the following:
Daily, Configuration, Entertainment, Information, IoT, Music & Audio, Media, Productivity, Transport.
Enter the primary category for your skill:
- IoT
Enter additional categories (optional):
-
Enter tags to make it easier to search for your skill (optional):
- IoT
- Smart Home
- Home Assistant
-
For uploading a skill a license is required.
Choose one of the licenses listed below or add one later.
1: Apache v2.0
2: GPL v3.0
3: MIT
Choose license above or press Enter to skip? 3
Some of these require that you insert the project name and/or author's name. Please check the license file and add the appropriate information.
Does this Skill depend on Python Packages (PyPI), System Packages (apt-get/others), or other skills?
This will create a manifest.yml file for you to define the dependencies for your
Skill.
Check the Mycroft documentation at mycroft.ai/to/skill-dependencies to learn more about including dependencies, and the manifest.yml file, in Skills. (y/N) y
Would you like to create a GitHub repo for it? (Y/n) y
Enumerating objects: 12, done.
Counting objects: 100% (12/12), done.
Delta compression using up to 4 threads
Compressing objects: 100% (10/10), done.
Writing objects: 100% (12/12), 2.52 KiB | 322.00 KiB/s, done.
Total 12 (delta 0), reused 0 (delta 0)
To https://github.com/cmheong/soldering-station-lamp-on-skill
* [new branch] master -> master
Branch 'master' set up to track remote branch 'master' from 'origin'.
Created GitHub repo: https://github.com/cmheong/soldering-station-lamp-on-skill
Created skill at: /opt/mycroft/skills/soldering-station-lamp-on-skill
And that is all there is to it. Don't worry about uploading to GitHub - it is optional. You will now get a whole bunch of smallish files:
(.venv) pi@picroft:~ $ ls -l /opt/mycroft/skills/soldering-station-lamp-on-skill/
total 32
-rw-r--r-- 1 pi pi 393 May 14 15:15 __init__.py
-rw-r--r-- 1 pi pi 1058 May 14 15:20 LICENSE.md
drwxr-xr-x 3 pi pi 4096 May 14 15:13 locale
-rw-r--r-- 1 pi pi 1009 May 14 15:20 manifest.yml
drwxr-xr-x 2 pi pi 4096 May 14 15:15 __pycache__
-rw-r--r-- 1 pi pi 531 May 14 15:17 README.md
-rw-r--r-- 1 pi pi 35 May 14 15:17 settings.json
-rw-r--r-- 1 pi pi 631 May 14 15:20 settingsmeta.yaml
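The locale directory is where the phrases from the questionnaire end up. Assuming msk's usual layout (an en-us subdirectory, with file names built from the reversed repo name, matching the intent name the generated code refers to), it should hold something like this:

```text
locale/en-us/on.lamp.station.soldering.intent:
    Soldering station lamp on
    Soldering station light on
    Turn on the soldering station lamp
    Turn on the soldering station light

locale/en-us/on.lamp.station.soldering.dialog:
    The soldering station light is now on
    Turning on the soldering station light
```

The intent file feeds the phrase matcher; the dialog file is what speak_dialog() picks a random line from.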
The file we are interested in is __init__.py. Since this is a toy example to get you going, we are going to use the crudest possible and most insecure method: a bash shell launching our webhook. The modified file is in my GitHub repository, but it is so small I'll also list it here:
$ cat __init__.py
from mycroft import MycroftSkill, intent_file_handler
import subprocess


class SolderingStationLampOn(MycroftSkill):
    def __init__(self):
        MycroftSkill.__init__(self)

    @intent_file_handler('on.lamp.station.soldering.intent')
    def handle_on_lamp_station_soldering(self, message):
        # Crude and insecure: shell out to curl to hit the relay's webhook
        cmd = "curl -k http://ww.xx.yy.zz:8080/1/on"
        answer = ""
        try:
            answer = subprocess.check_output(cmd, shell=True)
        except subprocess.CalledProcessError:
            pass  # relay did not answer; fall through and report anyway
        print(str(answer))
        self.speak_dialog('on.lamp.station.soldering')


def create_skill():
    return SolderingStationLampOn()
Here's a video of the result. You will notice there is another skill to turn off the lamp.
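For anyone uneasy about launching curl with shell=True, the same webhook call can be made from Python directly with the standard library. This is only a sketch: the relay's address and its /1/on endpoint are placeholders carried over from the listing above, and the little stand-in server here is my own invention, mimicking a relay that answers plain GETs so you can test the skill logic without the esp8266 on the bench.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen
import threading


def trigger_webhook(url, timeout=5):
    """Fire a relay webhook with urllib instead of shelling out to curl."""
    with urlopen(url, timeout=timeout) as resp:
        return resp.read().decode()


class FakeRelay(BaseHTTPRequestHandler):
    """Stand-in for the esp8266 relay: acknowledges GETs like /1/on."""
    def do_GET(self):
        body = ('OK ' + self.path).encode()
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet


# Bind to port 0 so the OS picks a free port, serve in the background
server = HTTPServer(('127.0.0.1', 0), FakeRelay)
threading.Thread(target=server.serve_forever, daemon=True).start()

answer = trigger_webhook('http://127.0.0.1:%d/1/on' % server.server_port)
print(answer)  # OK /1/on
server.shutdown()
```

In the skill, the handler body would shrink to a single trigger_webhook() call pointed at the real relay, avoiding the shell entirely.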
So, it is back to my Respeaker and DeepSpeech image: we will look at Jasper in Part 2.
Happy Trails