Mycroft* is a software platform designed to bring vocal computing to everyone. This friendly-looking personal assistant is designed to let customers personalize it as needed, and because it is built on Python*, an open source language, even novices can build a skill for Mycroft. Intel® Software Innovator Joshua Montgomery, Karl Fezer, and Steve Penrod of the Mycroft team let me pick their brains to learn a bit more about Mycroft.
Tell us about Mycroft.
Mycroft is the open source alternative to Siri or Alexa. It is a complete personal assistant designed to allow enterprise customers to white-label, customize, and extend it. We started the project by building a reference device for makers. It has a high-quality mic and speaker and looks a little bit like E.T., WALL-E, or Baymax from Big Hero 6.
But Mycroft isn't a hardware company; in fact, we've open sourced our hardware design and are encouraging others to use the technology. Mycroft is a software platform designed to bring vocal computing to everyone.
What kind of technology is used?
Mycroft makes use of deep learning, natural language processing, graph databases, and the latest and greatest in distributed computing technology. We also use easily accessible technology, specifically Python. We are working to make sure that even novice coders can build a skill for Mycroft.
The Mycroft framework is designed to be portable and accessible for novice programmers, yet powerful enough for experienced programmers to achieve anything they can imagine. Python meets both these requirements as a language, plus it enjoys the strong support of educational resources and development tools.
Mycroft is accessible on a wide range of equipment. Linux is the native environment, enabling it to run on embedded and Internet of Things (IoT) systems built on single-board computers like the Raspberry Pi*, small form factor computers like an Intel® NUC, or full desktops running rich graphical user interface (GUI) environments like KDE* Neon.
Powerful natural language processing is made simple via the rules-based Adapt* intent parser. It includes a context system that augments the literal spoken words with additional context, even non-verbal context. The default text-to-speech engine, Mimic*, is written in lightweight, portable C, so even disconnected systems can provide natural verbal feedback.
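To give a feel for the rules-based approach, here is a tiny pure-Python sketch in the spirit of Adapt. This is not the real Adapt API; the class and method names are illustrative. The core idea it demonstrates is that an intent matches an utterance only when every required keyword set is present, while optional entities simply enrich the result when found.

```python
# Simplified, Adapt-style rules-based intent matching (illustrative only).
class Intent:
    def __init__(self, name):
        self.name = name
        self.required = []   # list of (entity_name, keyword_set) pairs
        self.optional = []

    def require(self, entity, keywords):
        self.required.append((entity, keywords))
        return self

    def optionally(self, entity, keywords):
        self.optional.append((entity, keywords))
        return self

    def match(self, utterance):
        words = utterance.lower().split()
        result = {"intent": self.name}
        for entity, keywords in self.required:
            hit = next((w for w in words if w in keywords), None)
            if hit is None:
                return None  # a required entity is missing: no match
            result[entity] = hit
        for entity, keywords in self.optional:
            hit = next((w for w in words if w in keywords), None)
            if hit is not None:
                result[entity] = hit
        return result

# Build an intent: "weather" or "forecast" is required, a location is optional.
weather = (Intent("WeatherIntent")
           .require("WeatherKeyword", {"weather", "forecast"})
           .optionally("Location", {"london", "tokyo", "paris"}))

print(weather.match("what is the weather in tokyo"))
# -> {'intent': 'WeatherIntent', 'WeatherKeyword': 'weather', 'Location': 'tokyo'}
print(weather.match("play some music"))
# -> None (no required keyword present)
```

The real Adapt parser adds vocabulary registration, confidence scoring, and the context system on top of this basic matching idea.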
All this can be customized with the skill system. Anything possible or accessible from Python can be controlled by modules which the user selects based on their needs. System integrators can expose custom sensors and hardware capabilities with a similar plug-in mechanism.
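The skill system described above can be sketched in a few lines of plain Python. This is a simplified illustration, not Mycroft's actual skill API, and all names here are hypothetical: a skill registers handlers for trigger phrases, and matching utterances are routed to the right handler.

```python
# Toy sketch of the skill pattern (hypothetical names, not Mycroft's API).
class Skill:
    """Base class: subclasses register trigger-phrase -> handler mappings."""
    def __init__(self):
        self.handlers = {}

    def register(self, phrase, handler):
        self.handlers[phrase] = handler

    def handle(self, utterance):
        # Route the utterance to the first handler whose phrase it contains.
        for phrase, handler in self.handlers.items():
            if phrase in utterance.lower():
                return handler(utterance)
        return None  # no skill handler matched

class HelloSkill(Skill):
    """A minimal skill: respond to any utterance containing 'hello'."""
    def __init__(self):
        super().__init__()
        self.register("hello", self.say_hello)

    def say_hello(self, utterance):
        return "Hello! I am listening."

skill = HelloSkill()
print(skill.handle("Hey, hello there"))   # -> "Hello! I am listening."
print(skill.handle("turn off the lights"))  # -> None
```

In the real framework, the same pattern appears as a skill class with intent-handler methods, which is what keeps the barrier to entry low for novice programmers.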
Finally, this robust framework is actively being extended. Support is in place for multiple spoken languages. Machine learning is being applied for example-based intent parsing. The modular design is allowing novel approaches to speech-to-text in different languages, such as research using Recurrent Neural Networks for Indian dialects.
How did you come up with the name Mycroft?
Mycroft was the hero of Robert Heinlein's 1966 novel "The Moon Is a Harsh Mistress". In the book, an Artificial Intelligence (AI) supercomputer helps to free a nation. We were inspired by this vision of AI as a technology that can be used for good, so we named our technology Mycroft.
Why was it important to make the device anthropomorphic?
We were inspired by Guy Hoffman who has built some amazing robots that collaborate with humans to make music and create art. We felt that the technology should be friendly. It should be relatable. It should reflect the best of humanity.
What technical challenges have you had to overcome in developing Mycroft?
You name it, we've encountered it. Wake word accuracy, WiFi setup, natural language processing approaches, voice creation: all of these problems are difficult. To make the Echo, Amazon acquired the speech transcription company Yap* and the text-to-speech company Ivona*, then spent years and tens of millions of dollars building the product. Thanks to our open community, we've reached the same place in 18 months with less than a million dollars.
Tell us about how you’ve used crowdfunding for the development of Mycroft.
We were working on Mycroft by ourselves when we realized that the speaker we were building might be something that other people would buy. We took the idea to Kickstarter* and discovered that there were thousands of people out there looking for an open solution.
Crowdfunding has been instrumental in our success. It is our community of developers and investors who are supporting the effort until we get enough enterprise customers to pay our own way. We are thankful and humbled at the level of support we've received from Techstars*, 500 Startups*, Crowdfunder*, Indiegogo* and Kickstarter.
For our latest round of funding, Rob Ness will be setting up an AngelList syndicate to support Mycroft in early May.
What sets Mycroft apart from the competition?
Mycroft is open, so enterprises can white-label, customize, and extend it. Companies can control their data and provide customers with a branded voice experience.
You’ll be at Pycon this year. Tell us about that.
This will be Mycroft's first time at PyCon, and we are very excited to attend. Mycroft is written in Python, which, as an open-source language, is perfect for open-source projects like ours. PyCon is in Portland, OR, May 17–25, and it will be a great opportunity for new Python developers to reach out to us. We were lucky enough to be hosted by Intel, so if anyone is interested, we'll be in the Intel area of the convention hall. We will also stick around for the Sprint Sessions if any developers want to help us with a few ideas or start their own. We would love to chat!
Want to learn more about the Intel® Software Innovator Program?
You can read about our innovator updates, get the full program overview, meet the innovators, and learn more about innovator benefits. We also encourage you to check out Developer Mesh to learn more about the various projects that our community of innovators is working on.
Interested in more information? Contact Wendy Boswell on Twitter.