-[](https://app.circleci.com/pipelines/github/danic85/companion-robot)
-
-# Robotics Development Framework
-This platform enables modular development and experimentation in robotics, using Python / C++ on the Raspberry Pi and Arduino.
-
-## Coral TPU Accelerator
-
-To use the Google Coral USB Accelerator, first flash the Pi SD card with the image found in the [AIY Maker Kit](https://aiyprojects.withgoogle.com/maker/)
-([Download as of 2022-08-05](https://github.com/google-coral/aiy-maker-kit-tools/releases/download/v20220518/aiy-maker-kit-2022-05-18.img.xz))
-
-(I attempted to install the required software from the Coral [getting started guide](https://coral.ai/docs/accelerator/get-started#1-install-the-edge-tpu-runtime), but I was unable to get past a grpcio error: `GLIBC_2.29' not found`.)
-
-Alternatively, set `Config.vision.tech` to `opencv` for the original (slower) facial recognition. This is no longer maintained, so you may encounter integration issues.
-
-## Installation
-```
-chmod 777 install.sh
-./install.sh
-```
-
-Disable audio (see the Neopixel section below).
-
-## Running
-```
-./startup.sh
-```
-To execute manual control via keyboard:
-```
-./manual_startup.sh
-```
-To execute startup including a preview of the video feed (not available via SSH):
-```
-./preview_startup.sh
-```
-
-### Testing
-```
-python3 -m pytest --cov=modules --cov-report term-missing
-```
-
-## Run on Startup
-
-Execute `sudo vim /etc/rc.local` and add the following lines before the `exit 0` statement:
-```
-# Run the shutdown listener in the background so boot is not blocked
-python3 /home/archie/companion-robot/shutdown_pi.py &
-/home/archie/companion-robot/startup.sh
-```
-
-### Auto shutdown
-GPIO 26 is wired so that the Pi can be shut down by pulling the pin to ground with a switch.
-
-The script `shutdown_pi.py` manages this.
-
-Guide:
-https://howchoo.com/g/mwnlytk3zmm/how-to-add-a-power-button-to-your-raspberry-pi
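For reference, a minimal sketch of such a shutdown listener, assuming `gpiozero` and a 2-second hold time (the actual `shutdown_pi.py` may be implemented differently):

```python
# Hypothetical sketch of a shutdown listener for a switch on GPIO 26.
# The real shutdown_pi.py may differ.
from signal import pause
import os

from gpiozero import Button  # assumes gpiozero is available on the Pi

# The switch pulls GPIO 26 to ground; gpiozero enables the internal pull-up by default.
shutdown_button = Button(26, hold_time=2)

def shut_down():
    # Halt the Pi cleanly once the button has been held for the hold time.
    os.system("sudo shutdown -h now")

shutdown_button.when_held = shut_down
pause()  # keep the process alive, waiting for the button
```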
-
-## Features
-
-### Facial detection and tracking
-Using the Raspberry Pi camera
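As a generic illustration of the kind of detection involved (this is not the project's tracking module), OpenCV's bundled Haar cascade can find faces in a captured frame; the image path is a placeholder:

```python
# Illustrative only: detect faces in one frame using OpenCV's Haar cascade.
import cv2

# Cascade file shipped with the opencv-python package
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("frame.jpg")  # placeholder for a frame from the Pi camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    print(f"Face at x={x}, y={y}, size {w}x{h}")
```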
-
-### Servo control
-Control of up to 9 servos via an Arduino serial connection
-
-### Battery monitor
-Both an external hardware monitor and software integration via the Arduino serial connection
-
-### Buzzer
-A buzzer is connected to GPIO 27 so that tones can be played in the absence of audio output (see the Neopixel section below).
-https://github.com/gumslone/raspi_buzzer_player.git
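A minimal sketch of sounding a tone on that pin with `RPi.GPIO` PWM; the frequency and duration are arbitrary examples, not values from the project:

```python
# Illustrative sketch: play a short tone on the buzzer attached to GPIO 27.
import time

import RPi.GPIO as GPIO

BUZZER_PIN = 27  # as wired above

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUZZER_PIN, GPIO.OUT)

tone = GPIO.PWM(BUZZER_PIN, 440)  # 440 Hz
tone.start(50)                    # 50% duty cycle
time.sleep(0.5)                   # let it sound for half a second
tone.stop()
GPIO.cleanup()
```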
-
-### Motion Sensor
-An RCWL-0516 microwave radar sensor is connected to GPIO 13.
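The sensor output is a simple digital high on motion, so a minimal check could look like the following (assuming `gpiozero`; the project's own module may differ):

```python
# Illustrative sketch: print events from the RCWL-0516 output on GPIO 13.
from signal import pause

from gpiozero import MotionSensor  # treats the digital output like a PIR signal

sensor = MotionSensor(13)

sensor.when_motion = lambda: print("Motion detected")
sensor.when_no_motion = lambda: print("Motion stopped")

pause()
```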
-
-### Stereo MEMS Mics
-GPIO 18, 19 and 20 allow stereo MEMS microphones to be used as audio input.
-```
-Mic 3V to Pi 3.3V
-Mic GND to Pi GND
-Mic SEL to Pi GND (this is used for channel selection, connect to either 3.3V or GND)
-Mic BCLK to BCM 18 (pin 12)
-Mic DOUT to BCM 20 (pin 38)
-Mic LRCL to BCM 19 (pin 35)
-```
-https://learn.adafruit.com/adafruit-i2s-mems-microphone-breakout/raspberry-pi-wiring-test
-
-Install the I2S microphone driver (see the Adafruit guide above):
-```
-cd ~
-sudo pip3 install --upgrade adafruit-python-shell
-wget https://raw.githubusercontent.com/adafruit/Raspberry-Pi-Installer-Scripts/master/i2smic.py
-sudo python3 i2smic.py
-```
-
-#### Test
-`arecord -l`
-`arecord -D plughw:0 -c2 -r 48000 -f S32_LE -t wav -V stereo -v file_stereo.wav`
-
-_Note:_ See below for additional configuration to support voice recognition.
-
-### Speech Recognition
-Trigger word for voice recognition (currently not used):
-https://snowboy.kitt.ai/
-
-Speech recognition is enabled whenever a face is visible.
-Ensure that the `device_index` specified in `modules/speechinput.py` matches your microphone.
-
-See `scripts/speech.py` to list input devices and test. See below for MEMS microphone configuration.
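If you want to check the available indices outside of the project scripts, the `speech_recognition` package can list them directly (a minimal sketch):

```python
# Illustrative sketch: list audio input devices and their indices
# so that device_index can be set to match your microphone.
import speech_recognition as sr

for index, name in enumerate(sr.Microphone.list_microphone_names()):
    print(f"{index}: {name}")
```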
-
-### MEMS Microphone configuration for speech recognition
-
-By default, the Adafruit I2S MEMS Microphone Breakout does not work with speech recognition.
-
-To support voice recognition on the MEMS microphone(s), the following configuration changes are needed.
-
-`sudo apt-get install ladspa-sdk`
-
-Create `/etc/asound.conf` with the following content:
-
-```
-pcm.pluglp {
-    type ladspa
-    slave.pcm "plughw:0"
-    path "/usr/lib/ladspa"
-    capture_plugins [
-        {
-            label hpf
-            id 1042
-        }
-        {
-            label amp_mono
-            id 1048
-            input {
-                controls [ 30 ]
-            }
-        }
-    ]
-}
-
-pcm.lp {
-    type plug
-    slave.pcm pluglp
-}
-```
-
-This enables the device `lp` to be referenced in voice recognition. It appears with index `18` in the example below.
-
-The sample rate should also be set to `16000`:
-
-`mic = sr.Microphone(device_index=18, sample_rate=16000)`
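Putting it together, a minimal capture-and-recognise sketch; the device index `18` and the Google recogniser backend are examples only, adjust to your setup:

```python
# Illustrative sketch: capture from the 'lp' ALSA device and run recognition.
import speech_recognition as sr

recognizer = sr.Recognizer()
mic = sr.Microphone(device_index=18, sample_rate=16000)  # index of the 'lp' device

with mic as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate for background noise
    print("Listening...")
    audio = recognizer.listen(source)

try:
    # recognize_google is one possible backend; the project may use another
    print("Heard:", recognizer.recognize_google(audio))
except sr.UnknownValueError:
    print("Could not understand audio")
```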
-
-References:
-
-* [MEMS Microphone Installation Guide](https://learn.adafruit.com/adafruit-i2s-mems-microphone-breakout/raspberry-pi-wiring-test)
-
-* [Adafruit Support discussing issue](https://forums.adafruit.com/viewtopic.php?f=50&t=181675&p=883853&hilit=MEMS#p883853)
-
-* [Referenced documentation of fix](https://github.com/mpromonet/v4l2rtspserver/issues/94)
-
-### Serial communication with Arduino
-
-In order to use the Raspberry Pi's serial port, we need to disable getty (the program that displays the login screen).
-
-`sudo raspi-config -> Interfacing Options -> Serial -> "Would you like a login shell to be accessible over serial" = 'No'. Restart`
| 165 |  | - | 
| 166 |  | -#### Connection via serial pins | 
| 167 |  | -Connect the Pi GPIO 14 & 15 (tx & rx) to the arduino tx & rx (tx -> rx in both directions!) via a logic level shifter, as the Pi is 3v3 and the arduino is (most likely) 5v. | 
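On the Pi side, opening that link with `pyserial` might look like the following; the device path `/dev/serial0` and the baud rate are assumptions, and the project's serial module may use different settings:

```python
# Illustrative sketch: exchange a line of text with the Arduino over the UART pins.
import serial  # pyserial

# /dev/serial0 maps to the GPIO 14/15 UART on most Raspberry Pi models
with serial.Serial("/dev/serial0", baudrate=9600, timeout=1) as link:
    link.write(b"ping\n")              # send a command to the Arduino
    reply = link.readline().decode().strip()
    print("Arduino replied:", reply)
```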
| 168 |  | - | 
| 169 |  | -####Upload to Arduino over serial pins | 
| 170 |  | -To upload over serial pins, press the reset button on the Arduino at the point that the IDE starts 'uploading' (after compile), otherwise a sync error will display. | 
| 171 |  | - | 
| 172 |  | -### Neopixel | 
| 173 |  | - | 
| 174 |  | -WS1820B support is included via the Pi GPIO pin 12. Unfortunately to support this you must disable audio on the Pi. | 
| 175 |  | - | 
| 176 |  | -``` | 
| 177 |  | -sudo vim /boot/config.txt | 
| 178 |  | -#dtparam=audio=on | 
| 179 |  | -``` | 
| 180 |  | - | 
| 181 |  | -This is also why the application must be executed with `sudo` | 
| 182 |  | - | 
| 183 |  | -https://learn.adafruit.com/neopixels-on-raspberry-pi/python-usage | 
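For reference, lighting the LEDs with the Adafruit CircuitPython NeoPixel library (as covered in the guide above) might look like this; the pixel count and brightness are placeholders, and the script must run with `sudo` as noted:

```python
# Illustrative sketch: set all NeoPixels on GPIO 12 to a dim red.
import board
import neopixel  # adafruit-circuitpython-neopixel

NUM_PIXELS = 12  # placeholder: set to the number of LEDs fitted
pixels = neopixel.NeoPixel(board.D12, NUM_PIXELS, brightness=0.2)

pixels.fill((255, 0, 0))  # (R, G, B)
pixels.show()
```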
-
-## PCBs
-Prefabricated PCBs are available for this project in the `circuits` folder. These provide the connections between the core components described above.
-
+# Open Source, 3D Printable, Modular Bipedal Robot Project
+
+The **Modular Bipedal Robot** project aims to educate and inspire individuals interested in robotics and electronics. This open-source initiative focuses on creating a fully autonomous companion robot with a variety of advanced features.
+
+## Key Features
+
+- **Bipedal Design**: The robot includes articulated legs for bipedal movement.
+- **Control Systems**: Utilizes Arduino and Raspberry Pi, managed through custom PCBs.
+- **Modular Body**: Configurable body components allow for easy customization and adaptability.
+- **Software Modules**:
+  - Animation: Handles the animation of the robot, including walking, turning, and other movements.
+  - Braillespeak: Converts text to Braille and plays it as a custom audio encoding via the onboard buzzer.
+  - Buzzer: Controls the buzzer for audio output. Includes the ability to play tones and melodies.
+  - ChatGPT: Uses the OpenAI GPT models to process text based on user input.
+  - Logging: Logs data to a file for debugging and analysis.
+  - Motion Detection: Handles motion detection using an onboard microwave motion sensor.
+  - Neopixel: Controls the onboard Neopixel LEDs for visual feedback.
+  - PiServo: Controls the servos connected to the Raspberry Pi.
+  - PiTemperature: Reads the temperature from the integrated temperature sensor on the Raspberry Pi.
+  - RTLSDR: Uses an RTL-SDR dongle to receive and process radio signals.
+  - Serial Connection: Handles serial communication between the Raspberry Pi and Arduino.
+  - Servos: Controls the servos connected to the Arduino via the Raspberry Pi and the serial connection.
+  - Tracking: Uses computer vision to track objects and faces using the onboard camera.
+  - Translator: Translates text between languages using the Google Translate API.
+  - TTS: Converts text to speech using the onboard speaker.
+  - Viam: Uses the Viam API to integrate Viam modules for additional functionality.
+  - Vision: Handles image processing and computer vision tasks using the onboard IMX500 Raspberry Pi AI camera.
+  - [Read more](https://github.com/makerforgetech/modular-biped/wiki/Software#modules)!
+
+## Project Background
+
+The Modular Biped Robot Project is designed to provide a flexible and modular framework for robotics development using Python and C++ on the Raspberry Pi and Arduino platforms. It aims to enable developers, robotics enthusiasts, and curious individuals to experiment, create, and customize their own biped robots. With a range of built-in features and the option to easily add your own, the Modular Biped Robot Project offers an exciting opportunity to explore the world of robotics.
+
+## Modularity
+
+The open source framework is designed for flexibility, allowing users to easily add or remove components to suit their specific needs. Comprehensive [guides](https://github.com/makerforgetech/modular-biped/wiki/Software#creating-a-module) are provided for integrating new modules seamlessly.
+
+## Resources
+
+- **Documentation**: For detailed information, visit the project's GitHub wiki: [Modular Biped Documentation](https://github.com/makerforgetech/modular-biped/wiki)
+- **Code**: Check out the modular open source software on [GitHub](https://github.com/makerforgetech/modular-biped)
+- **YouTube Playlist**: Explore the development process through our build videos: [Watch on YouTube](https://www.youtube.com/watch?v=2DVJ5xxAuWY&list=PL_ua9QbuRTv6Kh8hiEXXVqywS8pklZraT)
+- **Community**: Have a question or want to show off your build? Join the communities on [GitHub](https://bit.ly/maker-forge-community) and [Discord](https://bit.ly/makerforge-community)!