Final project 2025. Secret Messages.¶
Note: This material will be updated during the course with clarifications and additions to the content. We will also add hints to help with common problems and other useful information, so check this page from time to time!
Last updated: 28.10.2025
- Added instructions for using the serial client.
- Slight modification to the message protocol: a message now ends with two spaces and a line feed instead of three spaces.
Previous versions:
- 1.10.2025:
  - First version available.
- 21.10.2025:
  - Updated the final deadline for the project.
  - Slightly adjusted the grading. Tier 2 is now worth 11 points and Tier 3 is worth 3 points.
Introduction¶
In this project, you and your team are working for a government intelligence agency. Your task is to develop a communication device that allows agents to send messages to each other using the agency's own system. For this purpose, the Raspberry Pi Pico together with the JTKJ Hat will serve as the communication tool.
If you have any questions regarding this material, please first have a look at the extensive FAQ section we have prepared; your answer might already be there. You can also check Discord. If you do not find your answer anywhere, do not hesitate to contact us.
Learning goals¶
In this project, the students will learn to use different peripherals connected to the Raspberry Pi Pico while working with FreeRTOS as the real-time operating system.
At the basic completion level, the students will be able to recognize certain positions of the device using an IMU sensor and transmit a character, based on these positions, to an external device via serial communication.
At more advanced tiers, the students will learn how to interact with different sensors and actuators in order to create a better user experience.
Deliverables¶
- Project Plan. Delivered by: 2025-10-24 23:59. The project plan summarizes what the students are going to implement.
- Final Project. Delivered by: 2025-11-23 23:59. Includes the program code you are implementing and, if applicable, a video presentation.
- Diary. At least one entry per week. Delivered by: 2025-11-23 23:59. An individual diary is required for each student.
- Module 2 Quiz. Dates available in the Deadlines section. Invigilated quiz.
- Evaluation Meeting. Times available during weeks 48–51.
Assessment criteria¶
The general assessment criteria are defined in the Assessment section.
Each tier's deliverables have their own assessment criteria, which are accessible from the corresponding Return box.
Project specifications¶
Secret messages¶
In the project work, you and your team need to implement a messaging program based on Morse code. The Morse alphabet is part of a messaging system developed by Samuel Morse in 1838 that was used to deliver messages via telegraph. Nowadays the Morse alphabet and telegraphy are rarely used, but for example the reconnaissance officers with radio training (also called "Sissiradisti" in Finnish) of the Finnish Defense Forces still learn and use them.
Radio amateurs are also sometimes required to learn these skills in order to be allowed to broadcast at certain frequencies.
The program to be implemented as a project work will recognize the dots, dashes, and spaces needed to form Morse-coded messages, based on data captured from the device’s IMU or, for example, from button presses. The produced messages will then be transferred to a workstation program, where they are converted back into text.
In some situations, the program developed for the course device should also be able to replay a Morse-coded message that has been sent to it.
- NOTE: For the basic project, your application does not need to perform the translation between Morse code and letters, or vice versa.
Message protocol¶
The communication between the device and the workstation follows the international Morse code alphabet, with some exceptions and application-specific clarifications. These are listed below.
- Every character encoded into Morse-alphabet is sent as a series of dashes "-" and dots "."
- Between every Morse-encoded character (e.g. "a" => ".-"), there is one space " "
- There are two spaces between words
- A message ends with two spaces and a line feed (\n)
Example¶
To make the rules defined above more concrete, we have an example of the usage of our protocol below.
Message to send: aasi on
Encoding the message as a sequence of Morse-encoded characters, inserting one space after every character and an extra space between words:
aasi on => . - ␣ . - ␣ . . . ␣ . . ␣ ␣ - - - ␣ - . ␣ ␣ ␊
(␣ = space). Extra spaces are added to make the symbols more visible.
The final message to be sent: .-␣.-␣...␣..␣␣---␣-.␣␣␊
Notice how the message ends with 2 spaces and line feed (\n).
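To make the framing rules easier to apply in code, below is a minimal C sketch that encodes the example message according to the protocol. The lookup table only covers the characters of the example and the buffer size is an arbitrary choice; a real implementation would cover the full Morse alphabet and handle buffering itself.

```c
/* Sketch: build the message "aasi on" according to the protocol
 * (one space after every character, an extra space between words,
 * and two spaces + '\n' at the end of the message). */
#include <stdio.h>
#include <string.h>

static const char *morse(char c) {
    switch (c) {
        case 'a': return ".-";
        case 's': return "...";
        case 'i': return "..";
        case 'o': return "---";
        case 'n': return "-.";
        default:  return "";     /* extend for the full alphabet */
    }
}

int main(void) {
    const char *text = "aasi on";
    char out[256] = "";

    for (const char *p = text; *p != '\0'; p++) {
        if (*p == ' ') {
            strcat(out, " ");    /* second space: word separator */
        } else {
            strcat(out, morse(*p));
            strcat(out, " ");    /* one space after every character */
        }
    }
    strcat(out, " \n");          /* second trailing space + line feed */
    fputs(out, stdout);          /* prints ".- .- ... ..  --- -.  " and a newline */
    return 0;
}
```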
Workstation program¶
The course's virtual machine already has a program installed that sends and receives Morse-coded messages over the USB serial connection. You can start it using the "Serial Client" shortcut on the desktop. If this does not work, open the virtual machine's terminal, navigate to the program's folder with the command cd ~/utils/serial_client, and start the program with the command python3 src/main.py.
When it starts, the program reads the config.json file located in the same folder, which contains the necessary configurations for the different development boards. The configuration file already has settings for the devices used in this course (Pico W), but it is also possible to add other devices. If the settings are read successfully, the program prints Config loaded with 3 known devices to the screen and then waits for a known device to be connected.
When you connect a known device to the workstation, the program switches to messaging mode. In this mode, any message typed into the terminal (except for program commands) is sent to the connected device in Morse code via USB serial communication. Data received from the device in Morse code is printed to the terminal. When two spaces followed by a line feed, which indicate the end of a message, are detected, the message is converted into text and printed in the window. In this way, you can see both the transmitted message and the characters that make it up, which can help with debugging the program.
Commands¶
As mentioned before, all messages containing only allowed characters are sent to the device via USB in Morse-encoded format, with the exception of the program commands. In addition, any text between a starting __ and an ending __ is not decoded as Morse, so you can insert debug messages among the Morse code by starting and ending them with __.
The following commands can be used to control the program:
- .clear clears the terminal screen.
- .exit stops the execution of the program.
Installation on your own computer¶
The source code and instructions for installing and running the program on your own computer can be found in the serial-client GitLab project.
Tiers and Requirements¶
The deliverables are divided into three tiers. Completing the basic tier (Tier 1) is required to pass the course. The other two tiers are optional and can be completed to improve your grade.
Tier 1: Basic Functionality (6 points)¶
- All members of the group must participate in the development of the program. Just testing or designing is not enough.
- The students must use the Raspberry Pi Pico W together with the JTKJ Hat provided during the course.
- The program must be written in the C language.
- The program must be developed using FreeRTOS, with at least two tasks.
- The code must compile successfully and be flashable to the device.
- The code must include comments explaining the program structure and the functionality of its different parts.
- The program must follow the coding guidelines presented in the course. Variable and function names must be meaningful.
- The program must use the folder structure presented in the mandatory exercises.
- Students must record a video demonstrating the functionality of their program. Requirements for the video are described below.
FUNCTIONAL REQUIREMENTS
1. The program must detect device positions to generate Morse symbols. At least two positions must be recognized. For example, leaving the device on a table may represent a dot (.), while rotating it 90 degrees may represent a dash (-). Buttons can be used to send a space, to confirm a position, or to trigger the sending of the entire message.
2. The program must be able to send the symbols (dot, dash, and space) to the workstation via USB serial communication. A minimal sketch of such a setup is shown below.
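The following sketch shows one possible way to structure these two requirements with FreeRTOS: an IMU task classifies the device position into a symbol and pushes it into a queue, and a serial task writes the symbols to the USB serial connection. The function hat_read_accel() is a hypothetical placeholder for the corresponding TKJHAT SDK call, and the thresholds, stack sizes, and priorities are example values only, not a reference implementation.

```c
/* Sketch (not a reference implementation): two FreeRTOS tasks on the Pico W.
 * One task polls the IMU and classifies the position into a Morse symbol,
 * the other sends confirmed symbols to the workstation over USB serial. */
#include <stdio.h>
#include "pico/stdlib.h"
#include "FreeRTOS.h"
#include "task.h"
#include "queue.h"

static QueueHandle_t symbol_queue;

/* Hypothetical placeholder: fill ax/ay/az with acceleration in g.
 * Replace with the actual TKJHAT SDK call. */
extern void hat_read_accel(float *ax, float *ay, float *az);

static void imu_task(void *params) {
    (void)params;
    for (;;) {
        float ax, ay, az;
        hat_read_accel(&ax, &ay, &az);

        char symbol = 0;
        if (az < -0.8f) {
            symbol = '.';                 /* device lying flat on the table */
        } else if (ax > 0.8f || ax < -0.8f) {
            symbol = '-';                 /* device rotated ~90 degrees */
        }
        /* In a real program, confirm each symbol (e.g. with a button press)
         * instead of sending continuously while the position is held. */
        if (symbol != 0) {
            xQueueSend(symbol_queue, &symbol, 0);
        }
        vTaskDelay(pdMS_TO_TICKS(100));   /* sample ~10 times per second */
    }
}

static void serial_task(void *params) {
    (void)params;
    for (;;) {
        char symbol;
        if (xQueueReceive(symbol_queue, &symbol, portMAX_DELAY) == pdTRUE) {
            printf("%c", symbol);         /* goes to the workstation over USB serial */
        }
    }
}

int main(void) {
    stdio_init_all();                     /* USB serial via pico-sdk stdio */
    symbol_queue = xQueueCreate(32, sizeof(char));

    xTaskCreate(imu_task, "imu", 512, NULL, tskIDLE_PRIORITY + 1, NULL);
    xTaskCreate(serial_task, "serial", 512, NULL, tskIDLE_PRIORITY + 1, NULL);

    vTaskStartScheduler();                /* never returns */
    for (;;) {}
}
```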
Requirements for the Video¶
- The video must be recorded in English or Finnish, preferably with captions.
- The video length should be between 1 and 2.5 minutes.
- If the video is uploaded to Lovelace, it must not exceed 50 MB. If it is uploaded to any online platform, there is no size limit.
- If the video is uploaded to an online platform, the link must be shared in the submission box. Make sure the video is not private — course staff must be able to access it.
- All members of the group must appear in the video.
- The video demo must show the following steps:
- Code compiles and can be flashed to the device.
- By changing the position of the device, different symbols are sent to the workstation.
- How the signal is received at the workstation and how the message is decoded. Both the device and the workstation terminal must be visible at the same time in the same shot (no separate windows).
- The students must discuss the following aspects:
- How many tasks are used, and what is the functionality of each task?
- How many interrupts are used? For each interrupt, define the source, the interrupt handler, and a short description of its function.
- How many global variables are used? What are they used for? What is the data type of each global variable? Why was this type selected?
- Are there any synchronization mechanisms used between tasks? Which one and why?
- How do you detect the positions? Do you use any filtering or data processing? Which sensor data do you use? What threshold values are applied to detect the positions?
- How do you send the data to the workstation? Which serial communication protocol do you use? How is the data structured?
- Explain the contribution of each individual member.
Assessment¶
The project is assessed on a pass/fail basis. Meeting the requirements grants 6 points.
Tier 2: Full-Duplex Communication and User Interface (Max. 11 points)¶
- All requirements from Tier 1 must be fulfilled. The video is not required.
Minimum Requirements¶
- It is mandatory to use one or more state machines in the implementation of the program (a minimal sketch is given after this list).
- In addition to sending symbols, the program must also be able to receive messages from the workstation. Received symbols must be displayed on the LCD and represented with another actuator (e.g., buzzer or LED).
- The program must provide feedback when a message is sent successfully.
- ALL students in the group must participate in the final meeting (see below).
NOTE: Your application does not need to perform the translation between Morse code and letters, or vice versa.
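As a starting point for the mandatory state machine, below is a hedged sketch of an enum-based state machine in C. The states and events are illustrative assumptions, not a required design; adapt them to your own user interface and tasks, and call the handler from wherever your program reacts to events.

```c
/* Sketch of an enum-based state machine for the Tier 2 program flow.
 * States and events are examples only -- adapt them to your own design. */
#include <stdio.h>

typedef enum {
    STATE_IDLE,        /* waiting for user input */
    STATE_COMPOSING,   /* positions/movements are being turned into symbols */
    STATE_SENDING,     /* message is transmitted to the workstation */
    STATE_RECEIVING    /* symbols arriving from the workstation are shown */
} app_state_t;

typedef enum {
    EVENT_BUTTON_PRESS,
    EVENT_MESSAGE_COMPLETE,
    EVENT_SEND_DONE,
    EVENT_SERIAL_DATA,
    EVENT_RECEIVE_DONE
} app_event_t;

static app_state_t handle_event(app_state_t state, app_event_t event) {
    switch (state) {
        case STATE_IDLE:
            if (event == EVENT_BUTTON_PRESS) return STATE_COMPOSING;
            if (event == EVENT_SERIAL_DATA)  return STATE_RECEIVING;
            break;
        case STATE_COMPOSING:
            if (event == EVENT_MESSAGE_COMPLETE) return STATE_SENDING;
            break;
        case STATE_SENDING:
            if (event == EVENT_SEND_DONE) return STATE_IDLE;   /* give user feedback here */
            break;
        case STATE_RECEIVING:
            if (event == EVENT_RECEIVE_DONE) return STATE_IDLE;
            break;
    }
    return state;   /* ignore events that are not valid in the current state */
}

int main(void) {
    app_state_t s = STATE_IDLE;
    s = handle_event(s, EVENT_BUTTON_PRESS);      /* -> STATE_COMPOSING */
    s = handle_event(s, EVENT_MESSAGE_COMPLETE);  /* -> STATE_SENDING */
    printf("state after two events: %d\n", (int)s);
    return 0;
}
```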
Additional Functionalities¶
1. Symbols can be generated using:
- Light sensor
- Microphone
- IMU (detecting 2–3 different movements instead of positions)
- Requires data collection and analysis to implement. See more details below.
- A combination of the above
Among these options, implementing the IMU provides more points than the microphone, which in turn provides more points than the light sensor. A robust IMU detection algorithm will earn the highest number of points.
2. One of the following possible additions:
- Menu structure in the user interface using a combination of buttons, sounds, LEDs, and the display.
- Music played with the buzzer at some point in the application.
- Detailed data collection and analysis for the IMU implementation.
- Translation of Morse symbols to letters
- Use of advanced synchronization mechanisms such as mutex semaphores and/or inter-task communication using queues or messages.
- Any other aspect that you consider extra work and that is not included in Tier 3.
Final Meeting¶
- The meeting can be held in English, Finnish, or a mix of both (“Finglish”).
- During the final meeting, all members of the group must be present. Absent member(s) will not receive a grade until they have clarified their contribution to the work.
- Each member of the group must be able to answer questions about the program implementation and the relation between theory and practice. FAILURE TO ANSWER BASIC QUESTIONS WILL RESULT IN A GRADE OF 0 FOR THE ENTIRE PROJECT.
- Each member of the group must be able to explain their personal contribution to the project.
- The students must present a demo of the program, showing the implemented functionalities. The demo is flashed from the teacher’s computer, and the serial communication program is also run from the teacher’s computer.
- At the end of the meeting, the students must return the device.
Assessment¶
Check the detailed assessment criteria in the Return Box for Tier 2.
Tier 3: Inter-Project Communication / Library Development (Max. 3 points)¶
- All minimum requirements from Tier 2 must be fulfilled.
- OPTION 1: The program must be able to communicate (using Morse code) with the device of another course team.
- Communication channel is up to the groups to decide:
- WiFi
- UART (a minimal sketch is given after this list)
- Light (LED + light sensor)
- Sound (Microphone + Buzzer)
- OPTION 2: Develop some aspect of the TKJHAT SDK library:
- Integrate sensor interrupts, that is, enable and use interrupts in the sensors.
- Provide some kind of utilities for the library
- Improve efficiency of the existing library
- Please discuss with the course staff whether your idea is viable and worth implementing.
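For groups considering the UART option, the sketch below shows how a Morse message could be sent and received over a UART on the Pico using the pico-sdk API. The UART instance, baud rate, and pin numbers are assumptions made for illustration; check which pins are actually free on the JTKJ Hat before wiring anything.

```c
/* Sketch: sending and receiving Morse characters over UART1 with the pico-sdk.
 * Pin and baud-rate choices are examples; verify them against the HAT pinout. */
#include "pico/stdlib.h"
#include "hardware/uart.h"

#define UART_ID   uart1
#define BAUD_RATE 9600
#define TX_PIN    4        /* GP4 = UART1 TX (example choice) */
#define RX_PIN    5        /* GP5 = UART1 RX (example choice) */

int main(void) {
    uart_init(UART_ID, BAUD_RATE);
    gpio_set_function(TX_PIN, GPIO_FUNC_UART);
    gpio_set_function(RX_PIN, GPIO_FUNC_UART);

    /* Send one protocol-formatted message to the other team's device. */
    uart_puts(UART_ID, ".- .- ... ..  --- -.  \n");

    while (1) {
        if (uart_is_readable(UART_ID)) {
            char c = uart_getc(UART_ID);
            (void)c;       /* feed received symbols into your own state machine here */
        }
    }
}
```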
Final meeting¶
See information in Tier 2.
In addition, members of the two collaborating groups should book consecutive meeting hours so that they can both do the final demo at the same time. However, the groups will present the rest of the work separately.
Assessment¶
Check detailed assessment criteria in the Return Box for Tier 3.
Additional support¶
Detecting movement¶
This section might be of interest to you if you decide to detect movement using the IMU.
Implementation of control commands as movements of the device, while holding it in hand, is freely decided by the group. Possible alternative implementations include at least the following:
- Sliding the device along the table surface
- Tilting and/or moving the device
- Bouncing the device (i.e., it occasionally leaves the table surface)
- Moving the device freely in the air without support ("3D" movement in the air).
General guidelines for interpreting control commands:
- It is advisable to determine how "strong" and how "long" in time one command is.
- For example, is the movement a small jerk or a slow movement lasting one second on the table surface in the chosen direction?
- When collecting data, the "sampling rate" is important, i.e., how often data is requested from the sensor.
- Example: once per second might be too slow an interval for the device's movement, but what if data is collected 5 or 10 times per second?
- Implementing a very "sophisticated" algorithm for interpreting control commands is not necessary. It is sufficient to analyze, for example, "changes in acceleration values as a function of time." See example data below.
- The exercise does not require a deeper understanding of mathematics than high school level, so the "quality of the recognition algorithm" for control commands is not evaluated. Of course, you can make the implementation as sophisticated as you want.
- However, the program must recognize movement commands in real-world test situations.
- "Chaining" control commands is okay, meaning that from continuous/long-lasting movement, multiple commands can be interpreted.
Interpreting a command¶
1. It is advisable to start the work by collecting your own dataset with the IMU sensor as a time series while manually moving the device. The collection should be done so that only one movement is made at a time and saved in its own file.
- It is important to include a timestamp in the measurement results to calculate the duration.
- The sensor provides six different measurement values per data query using our course library, which are:
- Acceleration on the x-, y-, and z-axes, measured in gravitational force "g"
- Gyroscope readings on the x-, y-, and z-axes, measured in "angular velocity"
When the data is collected into a table, the result is a table with seven columns. The data in the table might look something like this:
time, acc_x, acc_y, acc_z, gyro_x, gyro_y, gyro_z
0, 0.0029, -0.0063, -1.0115, 1.1215, -0.4578, 0.3204
11, 0.0133, 0.0291, -0.9753, 0.0229, -0.0458, -0.2060
19, 0.0028, 0.0964, -1.0406, 5.2948, -14.09, 19.0735
30, -0.085, 0.2474, -1.8069, -1.833, 0.0992, 15.4953
41, -0.20, 0.3864, -2.2000, -6.180, 29.9924, 70.5643
Note that the acceleration values are normalized with respect to gravity, so a value of 1 corresponds to 9.8 m/s² (for example, a reading of 0.5 is roughly 4.9 m/s²).
2. The collected data can be visualized in relation to different axes to see what happened in the data during each movement/command execution, and thus understand how to implement recognition programmatically.
By visualizing, it is easy to see what the sensor data looks like at different stages of the command when the device moves:
- Is it enough to measure just the movement and its duration? How does the movement appear in the data?
- Does the command need to start and end in a specific position?
- How much "error" (e.g., movement in the direction of other axes) is allowed in the execution of the command?
- etc.
3. After collecting and visualizing the dataset, the control command can be recognized from the data quite simply (a sketch is given after this list), for example based on:
- The duration of the movement in relation to different axes
- Exceeding/falling below a chosen threshold value in relation to an axis
- Various features can be calculated for the time series, such as the average, variance, standard deviation, etc., which can be used to distinguish commands from each other (and optionally fine-tune command recognition).
- This can also be useful if you want to chain commands or implement more complex commands (for those extra points?)
- Since sensor data is obtained on three axes (x, y, and z), it can also be treated as a 2- or 3-dimensional vector, which has, for example, direction and length.
- Anything you can find in a math book / online / taught in an AI course.
- Remember to mention sources in the code.
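To illustrate the feature-based approach above, here is a hedged sketch that computes the vector magnitude of each accelerometer sample, then the mean and standard deviation over a window, and flags a movement when the standard deviation exceeds a threshold. The window size, the threshold, and the sample values are made-up examples; derive your own from the data you collect.

```c
/* Sketch: simple feature-based detection of a "jerk" from a window of
 * accelerometer samples (values in g). Window size and threshold are
 * illustrative only -- tune them against your own recorded data. */
#include <math.h>
#include <stdbool.h>
#include <stdio.h>

#define WINDOW 10

/* Magnitude of the acceleration vector for one sample. */
static float magnitude(float ax, float ay, float az) {
    return sqrtf(ax * ax + ay * ay + az * az);
}

/* Returns true if the variation of the magnitude within the window
 * exceeds the threshold, i.e. the device was moved sharply. */
static bool detect_jerk(const float ax[], const float ay[], const float az[],
                        int n, float threshold) {
    if (n > WINDOW) n = WINDOW;

    float mag[WINDOW];
    float mean = 0.0f;
    for (int i = 0; i < n; i++) {
        mag[i] = magnitude(ax[i], ay[i], az[i]);
        mean += mag[i];
    }
    mean /= (float)n;

    float var = 0.0f;
    for (int i = 0; i < n; i++) {
        var += (mag[i] - mean) * (mag[i] - mean);
    }
    var /= (float)n;

    return sqrtf(var) > threshold;   /* standard deviation above threshold */
}

int main(void) {
    /* A stationary device measures roughly 1 g; the last samples simulate a jerk. */
    float ax[WINDOW] = {0, 0, 0, 0, 0, 0, 0, 0.1f, 0.4f, 0.8f};
    float ay[WINDOW] = {0};
    float az[WINDOW] = {-1, -1, -1, -1, -1, -1, -1, -1.2f, -1.8f, -2.2f};

    printf("jerk detected: %d\n", detect_jerk(ax, ay, az, WINDOW, 0.3f));
    return 0;
}
```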
Ready-made test data¶
The staff has pre-collected a small test dataset of various movements of a similar device held in hand, which can be used for preliminary command recognition in the work. The methods mentioned here for moving the device are not necessarily the only, easiest, or best way to implement commands with the device.
The test data was collected using the SensorTag's MPU9250 accelerometer. In the files, the data format is [timestamp, acc_x, acc_y, acc_z, gyro_x, gyro_y, gyro_z]. The data was measured so that the device was stationary at the beginning of the measurement and then moved up in two different ways. The images visualize the data for the x, y, and z axes (in different colors) over time for both the accelerometer (g) and the gyroscope (angular velocity).
1. Sensor data when the device is stationary:
2. Sensor data when the device is jerked quickly upward from the table surface two times:
3. Sensor data when the device is lifted from the table surface and moved upward:
Own test data¶
Collecting test data yourself is easiest if you first implement a separate data collection program or task on the device that simply collects movement sensor data into a table at selected intervals. Finally, the program prints the table row by row to the development environment, from where it can easily be copied and pasted into an editor and/or a file. The data can be visualized using a Python library (hint hint, matplotlib) or a program such as gnuplot or MS Excel.
When collecting your own test data, it is important to always perform the movement in the same way (the user and the sensor in the same position, etc.) and to make absolutely only one movement at a time. This way the visualization clearly shows what happened in the data, and the data collected in different measurements remains comparable. Note also that different people's hand movements usually differ so much that, for example, the test data above does not directly apply to your own command recognition. The sensor is precise enough to detect even small differences.
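A data collection task of the kind described above could look roughly like the sketch below: it stores IMU readings with a FreeRTOS tick timestamp in a fixed-size table and finally prints the table as CSV rows. The function hat_read_imu() is a hypothetical placeholder for the course library's IMU read call, and the sample count and interval are example values; the task is meant to be created from main with xTaskCreate like any other task.

```c
/* Sketch of a data collection task: samples the IMU at a fixed interval,
 * stores the readings in a table, and finally prints them as CSV rows
 * that can be copied into a file for plotting. */
#include <stdio.h>
#include "FreeRTOS.h"
#include "task.h"

#define SAMPLES 200            /* 200 samples at 50 ms = 10 seconds of data */

typedef struct {
    TickType_t tick;           /* timestamp in RTOS ticks */
    float ax, ay, az;          /* acceleration in g */
    float gx, gy, gz;          /* angular velocity */
} imu_sample_t;

static imu_sample_t table[SAMPLES];

/* Hypothetical placeholder for the course library's IMU read function. */
extern void hat_read_imu(float *ax, float *ay, float *az,
                         float *gx, float *gy, float *gz);

void collect_task(void *params) {
    (void)params;
    for (int i = 0; i < SAMPLES; i++) {
        imu_sample_t *s = &table[i];
        s->tick = xTaskGetTickCount();
        hat_read_imu(&s->ax, &s->ay, &s->az, &s->gx, &s->gy, &s->gz);
        vTaskDelay(pdMS_TO_TICKS(50));        /* sampling interval: 20 Hz */
    }

    printf("time,acc_x,acc_y,acc_z,gyro_x,gyro_y,gyro_z\n");
    for (int i = 0; i < SAMPLES; i++) {
        printf("%lu,%.4f,%.4f,%.4f,%.4f,%.4f,%.4f\n",
               (unsigned long)table[i].tick,
               table[i].ax, table[i].ay, table[i].az,
               table[i].gx, table[i].gy, table[i].gz);
    }
    vTaskDelete(NULL);                        /* collection done */
}
```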
Tips for the project work¶
Not so surprisingly, many things are already explained and many questions answered in the lecture materials. So it is beneficial to study the materials very carefully.
The implementation of the project work should be designed with the help of the Modular and procedural programming extra material. This way, even a big problem can be divided into parts that are easy to manage.
- Well planned is 90% done!
- A major mistake you can make is leaving things to the last minute. A beginner surely won't finish the work in a week!
- It's worth checking out the functionalities provided by the C language standard libraries and various RTOS libraries. Ready-made libraries greatly simplify programming, especially for tasks like string handling.
- When coding, it can be useful to code the difficult parts on a PC first. Testing motion detection algorithms and similar tasks on a PC is much easier. Once the code works, you can transfer the implementation to the device and test it there.
- Use intermediate print statements in loops, conditionals, etc., to print variable values or other essential information to the console window. This significantly speeds up debugging.
- A typical situation that is easily solved with intermediate print statements is a misbehaving loop structure: print the relevant variable values to the console window in each iteration. Even at work, your boss is unlikely to appreciate you spending a lot of time figuring things out when adding debug information to the code would solve it quickly.
- It's important to remember that if you get stuck, ask!! The course staff is there to help you.
Project-related tips¶
We will update this section with additional tips as the implementation of the project work progresses and we know more.
- The program should be divided into tasks, such as one for communication, another for playing sounds, and another for reading sensor data, etc.
- Implement an Enable/Disable Command Recognition function in the user interface.
- Sensor data should be collected in a table, making it easy to handle as a time series.
- Note the size of the table in bytes where you are collecting data to ensure it fits in the device's memory!
- Sensor data can be cleaned if necessary (e.g., removing noise) by calculating a moving average or using another method (a sketch is given after this list).
- The measurement timestamp for test data can be obtained using the FreeRTOS function xTaskGetTickCount().
- The initialization of the device components should not be placed in the main function but at the beginning of the tasks. This is because the implementation should be task-based, and some components require RTOS functionality that starts only at the end of main.
- The lecture material includes an example of a state machine, which should be applied in the exercise. A direct copy & paste to your project will not work.
- If you are planning to use the HAT with a battery (without a USB cable), the battery will eventually run out, especially if the device is accidentally left on. Implement a power button in the code. Please, note that the use of batteries with the device has not been tested yet.
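The moving-average tip mentioned above can be implemented in a few lines; the sketch below smooths a short example signal before any threshold comparison. The window length and sample values are illustrative only.

```c
/* Sketch: simple moving average over a sensor signal, which smooths out
 * short spikes before threshold comparison. Window length is illustrative. */
#include <stdio.h>

#define WINDOW 4

/* Averages the WINDOW most recent samples ending at index i.
 * For the first samples, averages only what is available. */
static float moving_average(const float *data, int i) {
    int start = (i + 1 >= WINDOW) ? i + 1 - WINDOW : 0;
    float sum = 0.0f;
    for (int j = start; j <= i; j++) {
        sum += data[j];
    }
    return sum / (float)(i - start + 1);
}

int main(void) {
    /* Example accelerometer z-axis readings (in g) with one noisy spike. */
    float acc_z[] = {-1.01f, -0.98f, -1.02f, -1.60f, -1.00f, -0.99f};
    int n = sizeof(acc_z) / sizeof(acc_z[0]);

    for (int i = 0; i < n; i++) {
        printf("raw % .2f  filtered % .2f\n", acc_z[i], moving_average(acc_z, i));
    }
    return 0;
}
```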
FAQ¶
- Can the work be done individually?
- The general answer is NO. If there are valid reasons for working alone, please contact the teacher. You might need to get your own hardware.
- Can the work be done in a larger group than a pair?
- The work can be done in a group of 2-3 people. In previous years, groups of three had to do additional tasks, but this practice is not in effect in autumn 2025. Groups of four are not allowed.
- The code worked at home, but not during the project evaluation. Can I try it on my own computer? Can I show it remotely from home?
- No. Make sure the code works in the development environment installed on the course virtual machines before submitting.
- Can I discuss the work with a friend?
- Of course, you can discuss it during exercises, social gatherings, etc., but if two groups submit the same code, you may be guilty of plagiarism. For this reason, groups that have collaborated should be mentioned in the code comments or similar.
- The line is that you can discuss the task, but you must write the code yourself!
- Can the work be submitted late?
- No.
- Can I use someone else's code (or code found online)?
- You cannot use the code of those participating in the same course, even if you cite the source. In other words, you cannot copy code from someone else's project into your own.
- Otherwise, you can, as long as you indicate where the code was obtained from (website, etc.), so you do not commit plagiarism.
- Can I use AI?
- Yes, as long as you specify in your code which AI you used, what prompts you used, and how you modified the code based on the AI's response. If during the meeting we suspect that you have used AI and you have not reported it, your project might be rejected.
- What about plagiarism?
- The university has guidelines on how to deal with plagiarism. We follow them.
Temporary code sharing space¶
Use this return box to share code with your team members. DO NOT USE THIS RETURN BOX TO UPLOAD THE FINAL VERSION OF YOUR CODE. CODE IN THIS RETURN BOX WILL NOT BE EVALUATED