I have read and worked through some of the ChipWhisperer Jupyter courses and decided to reproduce the
"Fault 1_1 - Introduction to Clock Glitching" course on an ATmega328 on a breadboard.
I uploaded the code for the ATmega and the Jupyter notebook to my Google Drive:
atmega firmware + jupyter notebook file
I deviated from the course in two ways:
- I used the CW-Lite's clock and divided it down to 16 MHz, so I could still upload firmware with my FTDI programmer at the default baud rate (see the sketch after this list).
- Instead of working with SimpleSerial, I chose to detect a glitch on the ATmega itself and toggle a pin accordingly. The calculation is performed once during setup(), so I reboot the device in each glitch loop.
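For reference, this is roughly how I set up the clock and glitch module. The glitch-module settings mirror the Fault 1_1 notebook; the 16 MHz value and the HS2-to-XTAL1 routing are my own setup, so treat this as a sketch rather than my exact notebook code:

```python
import chipwhisperer as cw

scope = cw.scope()
scope.clock.clkgen_freq = 16e6           # CLKGEN at 16 MHz for the ATmega328
scope.clock.adc_src = "clkgen_x1"        # trig_count then counts 16 MHz cycles
scope.glitch.clk_src = "clkgen"          # glitch module runs off CLKGEN
scope.glitch.output = "clock_xor"        # XOR glitch pulses into the clock
scope.glitch.trigger_src = "ext_single"  # one glitch per arm, on the trigger edge
scope.io.hs2 = "glitch"                  # glitchy clock out on HS2 -> XTAL1
```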
When I run my setup, things do not work out as expected.
Since I reboot each cycle and wait for the trigger with a timeout of 4 s, I expect "Started!" to be printed on the serial output of my FTDI programmer for every loop iteration. In reality I see it printed only once.
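My glitch loop looks roughly like the sketch below. The reset wiring (nRST driven from the scope's nrst line) and the loop details are assumptions I am spelling out here for clarity, not verbatim notebook code:

```python
import time

scope.adc.timeout = 4                 # wait up to 4 s for the trigger

for attempt in range(100):
    # pull the ATmega's reset low and release it, so setup() reruns
    # the calculation on every iteration
    scope.io.nrst = "low"
    time.sleep(0.05)
    scope.io.nrst = "high_z"

    scope.arm()
    timed_out = scope.capture()       # True if no trigger arrived within 4 s
    if timed_out:
        print("no trigger on attempt", attempt)
    print("trig_count: " + str(scope.adc.trig_count))
```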
To debug this I started printing trig_count in each loop:
print("trig_count: " + str(scope.adc.trig_count))
The count varies a lot between loops, ranging from 239,838 to 971,713, so the time between trigger high and trigger low appears to be about 15 ms to 60 ms.
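That conversion assumes trig_count counts cycles of the ADC clock running at the same 16 MHz rate as the target clock (i.e. adc_src = "clkgen_x1"); if the ADC were sampling at 4x, the real times would be a quarter of that:

```python
adc_clk = 16e6                        # assumed ADC sample clock (clkgen_x1)
for cycles in (239838, 971713):
    print(f"{cycles} cycles = {cycles / adc_clk * 1e3:.1f} ms")
# 239838 cycles = 15.0 ms
# 971713 cycles = 60.7 ms
```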
When I then reprogrammed the ATmega with a 1 s delay between trigger high and trigger low, the time measured via trig_count was the same as before, although I would expect it to be at least one second (around 16,000,000 cycles at 16 MHz).
Can anyone point out where I am going wrong?