Raspberry Pi 4 boot-from-mass-storage functionality enters beta phase

The techies from Raspberry Pi have just announced that the ability to boot Raspbian from a mass storage device, without an SD card, is one step closer today. There is now a public beta programme for the feature and you can get involved using your own hard drive or pen drive. Head over to the Raspberry Pi Forums, where you can find the instructions. Please also use the forum thread to report any issues with the operating system, giving as much detail of your set-up as you can.

I’m part of the Alpha testing team and I can confirm that “it worked for me!” 🙂

Create a London Eye model and turbine station with a micro:bit

This is lovely and also a great micro:bit resource. The folks over at Kitronik have written a detailed explanation of how to build this great London Eye and turbine station model over on their blog. It uses their All-in-One Robotics Board to drive both the “Eye” and the wind turbine on top of the turbine station. They include all the files to help you laser-cut the models (although I daresay you could do the same out of cardboard manually) and extensive instructions on how it goes together. This was part of their Bett 2019 display and you can read all the details and see the information about their “City” display here.

Raspberry Pi + new HQ Camera = PiDSLR (Guest post from David Booth!)

by David Booth, Hitchin Hackspace

Overview

I’ve been loaned a new Raspberry Pi High Quality (HQ) camera with both the 6mm wide-angle lens and the 16mm telephoto lens by a very generous benefactor. It was so I could create an automated 360-degree turntable lightbox with integrated HQ camera to take consistent, rotatable photos of robots at the next Pi Wars event.

Whilst in my possession, I wanted to take a bunch of photos through it to see the results and inspect them for image quality, light-gathering ability, focus control and a few other features. The easiest way for me to do this was to make a portable camera that I could put through its paces, hence the idea of PiDSLR pinged into my head. (I know this is technically a mirrorless camera, but I thought the PiDSLR name was cooler!)

Objectives

  1. Design and 3D-print a simple mount for the camera to connect to a Raspberry Pi.
  2. Create a way of viewing a live preview so that I can focus the lens whilst out and about.
  3. Create a way of triggering a photo to be taken.
  4. Put the loaned Raspberry Pi HQ Camera and lens kit through its paces.

Putting it all together

Using only parts I had lying around the house, I needed to find a way of making this work. I have Raspberry Pis “up the wazoo”, and I have the new HQ camera, so that wasn’t a problem.

Creating the preview screen posed a bit of a dilemma. Having taken a “few” photos in my past (probably around 10 a day on average, every day for the last 10 years), I know that when outside, preview screens can easily be washed out, so I wanted to go for an electronic viewfinder (EVF)-style preview. I searched for a while and found I had a few options in my house that fit the bill.

Option one was my old Panasonic FZ20, which has a ½-inch Electronic ViewFinder (EVF). A bit of Googling turned up no results on how to connect anything like this to a Pi, so I decided against destroying something that, technically, still works.

Option two was an Adafruit 128×128 TFT 1.4 inch screen that I used for Pi Eyes last Halloween, which I could use in conjunction with a single Google Cardboard lens to create a homemade EVF. Googling this showed it wasn’t going to be simple to set up. There’s precious little information on using the screen with the Raspbian desktop GUI, and I really didn’t want to go down the route of a bespoke image-buffer preview screen for something that is only a test. After a bit of playing with the Pi Eyes scripts, I did get the screen working as a desktop for Raspbian. However, looking through the Google Cardboard lens magnified everything so much that I was looking at the individual RGB pixels rather than an overview of the image, so I had to give up on this.

Option three was an old Adafruit 320×240 touch screen I bought years ago for something I can’t remember. It neatly connects to the Raspberry Pi via the GPIO headers, fits within (but fills) the Pi footprint so it looks neat, and has a setup script from the manufacturer that configures it to show the Raspbian desktop… SCORE!

Designing the Mount

Knowing my way around a 3D printer and CAD system, I knew I wanted to print a mount for the camera. I didn’t want to design something that would take hours to print, but I did want something neat and practical. The obvious choice was a simple base plate with the Pi mounted on one side and the camera on the other. At least for a test, anyway. I started by placing the camera on a piece of paper as a stencil and drawing around it, putting dots in the mount holes. I then placed a Pi over the sketch and similarly drew out its mount points. The camera connector on the Pi and the camera itself lined up reasonably well, so I could reuse one of the mount points between them and thus save a bolt! 🙂

I then used Onshape to draw up the sketch, then extrude and fillet it (for rounded corners). Slicing the design in Cura showed 38 minutes to print, which is well within my “less than an hour, that’s fine” timeframe.

It’s all too easy not to take into account components on the back of PCBs when designing 3D-printed mounts, and this was no exception. Whilst printing, I noticed that I would need a spacer between the Pi and the mount because the bolt heads holding the camera on the other side would be touching the back of the Raspberry Pi, potentially shorting out components. A simple solution was to quickly design and print short plastic pipes as spacers.

You can find the Onshape object for the mount here.

Assembling the parts

Assembly was super-simple, in theory. First, I attached the camera to the mount with three bolts and spacers, leaving the shared bolt hole empty for now. Then, I flipped over the board and mounted the Pi on the reverse, remembering to insert the spacers on the back, including one for the fourth, shared bolt hole for the camera.

The trickiness came when trying to tighten the M2.5 nuts around the camera, as they are very close to the frame and anything other than needle-nose pliers just wouldn’t fit easily. The pliers also aren’t great, as they have a tendency to slip off and cause frustration, so remember: if you replicate this project, take your time when doing this!

Before attaching the Adafruit 320×240 screen, I could either solder buttons directly to the GPIO pass-through pads or, if I wanted an extendable trigger button, I could solder short wires to the #27 pad and screw them into a female 3.5mm screw terminal jack. I decided I wanted the trigger button, so I soldered a couple of wires to a momentary push button and screwed them into a male 3.5mm screw terminal jack.

All that was needed then was to slot the screen onto the GPIO pins.

Photographs from the build, at various stages, can be found here.

Software

Firstly, you need to do the obvious step of installing Raspbian on a micro SD card. Open up a terminal and then enter the following commands to bring your Pi up-to-date:

sudo apt update

sudo apt upgrade

Then, enable the camera module by running:

sudo raspi-config

and using the Interfaces menu.

Then, navigate to your home directory:

cd ~

Install the Adafruit 320×240 screen by following their instructions. IMPORTANT: Make sure that you say no to having the PiTFT as a text console and choose yes when asked if you want it to be an “HDMI mirror”.

Once you’ve set up the screen, you can download the Python module for the Adafruit PiTFT screen created by elParaguayo that handles button presses and the backlight.

The correct way to get the module would be to clone the repo:

git clone https://github.com/elParaguayo/PiTFT_Screen.git

and move the pitftgpio.py file into your home folder (don’t forget the “.” on the end!):

mv PiTFT_Screen/pitftgpio.py .

Or you can do as I did, which is to simply download the Python file that you need:

wget https://raw.githubusercontent.com/elParaguayo/RPI-Info-Screen/PiTFT/pitftgpio.py

Now we need to create a Python file that tests for button presses and calls the raspistill command.

nano camera_button.py

Enter the following into the camera_button.py file:

from pitftgpio import PiTFT_GPIO
import datetime
import subprocess
import time

pitft = PiTFT_GPIO()
last_pressed = datetime.datetime.now()
while True:
    if pitft.Button3:  # pin 27 on this board
        # Get current date/time of button press
        now = datetime.datetime.now()

        # Stop multiple sequential button presses by testing the time
        # difference to the last known pressed date/time.
        if (now - last_pressed).total_seconds() >= 0.5:  # allow if over half a second between presses
            last_pressed = now
            print("Camera button pressed")

            # Format the date/time for appending to the filename
            filename1 = now.strftime("%Y_%m_%d-%H%M%S")

            # Build the raspistill command and call it using the subprocess module
            cmd = "raspistill -vf -hf -rot 180 -t 10000 -o /home/pi/Pictures/img_{}.jpeg".format(filename1)
            subprocess.call(cmd, shell=True)

    # Sleep briefly so the polling loop doesn't peg the CPU
    time.sleep(0.05)
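The half-second debounce check in the script is pure date arithmetic, so you can sanity-check it on any machine without the screen attached. (The `should_trigger` helper name below is my own invention, not part of the script above.)

```python
import datetime

def should_trigger(now, last_pressed, min_gap=0.5):
    """Return True if enough time has passed since the last button press."""
    return (now - last_pressed).total_seconds() >= min_gap

t0 = datetime.datetime(2020, 6, 1, 12, 0, 0)
print(should_trigger(t0 + datetime.timedelta(seconds=0.2), t0))  # False: too soon
print(should_trigger(t0 + datetime.timedelta(seconds=0.6), t0))  # True: over half a second
```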

Running on boot

There are a few ways to run a command when the Raspberry Pi boots. The best would be to create a systemd service that is told to start at boot. The advantage of this is that it is easy to start/stop/restart the script via the command line, and it can also be told to start only after critical components are ready. The main disadvantage with this method is that I can never remember how to write the service file or the commands that you use to enable it. (Mike: I covered this for a shutdown script previously)
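For reference, a minimal unit file for this approach might look something like the sketch below. The file name and path are my own illustration, not from the original write-up.

```ini
# /etc/systemd/system/camera-button.service  (hypothetical name and path)
[Unit]
Description=PiDSLR camera trigger button
After=graphical.target

[Service]
# raspistill shows its preview on the local screen
Environment=DISPLAY=:0
ExecStart=/usr/bin/python /home/pi/camera_button.py
Restart=on-failure
User=pi

[Install]
WantedBy=graphical.target
```

You would then enable it with `sudo systemctl enable camera-button.service` so it starts on every boot.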

A second, easier and, as far as I’m aware, still reasonable way is to use the @reboot option in crontab. To do this, edit the crontab file:

crontab -e

and add the following line to the bottom of the file:

@reboot export DISPLAY=:0 && python /home/pi/camera_button.py &

A third and frowned-upon way (and the way I used because it’s super-simple) is to add the following lines to the /etc/rc.local file.

# Tell raspistill to show its preview on the local screen.
export DISPLAY=:0
# Run the python camera script
/usr/bin/python /home/pi/camera_button.py &

IMPORTANT: the ‘&’ symbol at the end of the command above is critical. It tells the shell to run the Python script in the background, without waiting for it to finish. Without that symbol, the Pi will get stuck part-way through booting.

Results

It takes a few tries to get your head around the manual focusing within the 10-second preview time and I definitely found the 16mm telephoto lens much easier to use in that respect. (A guide and a note on focusing the 6mm lens can be found here). The resulting photos are a significant improvement on the standard Raspberry Pi camera but, obviously, I won’t be swapping my proper DSLR camera for this just yet (although the five to ten hour battery life I would get from the Sandstrom 12.8 amp hr USB battery pack is better than my DSLR’s battery life!)

All the full-resolution shots taken with the HQ camera can be found here.

This is my full, static set-up on a tripod with trigger button attached:

Full disclosure: the following photos from the Pi HQ camera have been mildly tweaked in Lightroom for colour correction, but this is my standard process after I take photos with my DSLR anyway.

16mm Telephoto Lens

6mm Wide Angle Lens


The 4tronix Really Useless Box for micro:bit – a quick review

Hi everyone. I’ve just returned from my shed where I’ve been putting together the 4tronix Really Useless Box. This is a kit for the micro:bit made out of 7 PCBs and a nice switch, with a servo that tells you “No, you may not open this box!” in no uncertain terms!

The kit comes in an anti-static bag and there are a lot of parts to make the end project. You start out by screwing on mount fittings and then you add the servo to one of the side walls. You then build up the rest of the box around that part. The instructions that are available online are clear enough, although sometimes I did wonder which way round things went. There are a lot of screws in the kit, and sometimes it’s not quite clear which ones go where, but I worked it out based on how many of each there were and didn’t make any mistakes the whole way through, so that must speak to how good the instructions are.

At the end of the build, you are left with a box with a micro:bit as one of the end walls. You flick the switch to open the box and then the servo controls an arm which politely, but resolutely, flicks the switch back again to close the lid. It’s entirely pointless (hence the name), but it is great fun to build. It’s also very good value (at £12) and took me about an hour and a half to put together. You could probably assemble it in a lot less time than I did – maybe 45 minutes – but I was enjoying myself 🙂

It also comes with all the code you need to calibrate the servo and program the full box in Makecode. This means it’s an ideal project for beginners, or for someone who fancies a bit of fun 🙂

You can buy the Really Useless Box from 4tronix. You’ll need to provide your own micro:bit.

Building your first robot with the micro:bit – a new tutorial from @raspibotics

Getting started with robotics is always tricky. For Raspberry Pi robotics, I always point people at the Pi Wars Hints and Tips guide. For micro:bit, however, you might find this guide a great place to start to build the robot pictured above. It uses a micro:bit, obviously, and the Kitronik Motor Controller together with two micro metal gear motors and a ball caster as the guts of the robot. It then uses a second micro:bit as the controller for the robot, operating over radio to issue commands to the receiver onboard the robot.

You can read the entire guide here and see a video of the robot working below:

Entirely useless keyboard made by inserting floppies uses a Raspberry Pi and a Teensy LC

Foone Turing (follow him on Twitter here) has done some crazy builds in the past but this time he’s gone for… entirely useless!

He’s hooked up an old floppy drive to a Raspberry Pi which reads the disk insertion events and then looks to see what the volume label is: the volume labels are letters or special keys like SHIFT. These are then sent to a Teensy LC which converts the signals into proper keystrokes which it then sends to a host PC. It’s completely batty, but I love it! Talk about whimsical 🙂 You can see it in action below:
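Foone hasn’t published the mapping code here, but the label-to-keystroke step might look something like this sketch. All names and the exact dictionary are my own invention; the letter codes follow the standard USB HID usage table, where ‘A’ starts at 0x04.

```python
# Map a floppy's volume label to a USB HID keycode to forward to the Teensy.
# Labels are either a single letter or a special-key name like "SHIFT".
SPECIAL_KEYS = {"SHIFT": 0xE1, "ENTER": 0x28, "SPACE": 0x2C}

def label_to_key(volume_label):
    label = volume_label.strip().upper()
    if label in SPECIAL_KEYS:
        return SPECIAL_KEYS[label]
    if len(label) == 1 and label.isalpha():
        # USB HID usage IDs for letters run from 0x04 ('A') to 0x1D ('Z')
        return 0x04 + (ord(label) - ord("A"))
    return None  # unrecognised label

print(hex(label_to_key("a")))      # 0x4
print(hex(label_to_key("SHIFT")))  # 0xe1
```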

Thanks to Hackaday for spotting this one.