

Background psd created by zlatko_plamenov

FlipAIoT Education App

UX/UI Design | 2019

FlipRobot is a company that provides robotic STEAM learning solutions for the in-class environment. This app is one of the apps that accompanies their curriculum, and it was also used in the "OneWorld Robotics Competition (OWRC)."

The main subjects of the curriculum are "AI" and "IoT." It covers topics such as "Autopilot Car," which involves vision recognition, and "Unmanned Stores," which uses speech recognition. Five functions are included in FlipAIoT, as shown below. I will be introducing "Vision Recognition" and "Cause and Effect."

folio_web_flipaiot_01.png

Feature 1

Vision Recognition

"Vision Recognition" was designed to help children and teenagers understand the process of AI training in a simple way. They won't be learning how to write an algorithm, but rather the idea that you can "teach" a machine, or a model.

Look at the picture on the right. To start using Vision Recognition, add examples to your database, then check whether you have enough examples for the algorithm to recognize them. A confidence percentage is shown in the interface. If the percentage is not satisfying, consider adding more examples or adjusting your model by resetting the training.
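The add-examples / check-confidence / retrain loop above can be sketched in a few lines of Python. This is only an illustrative sketch: the class and method names (`VisionTrainer`, `add_example`, `confidence`, `reset`) are hypothetical, and the confidence heuristic is a toy stand-in for a real vision model's predictions.

```python
class VisionTrainer:
    """Toy sketch of the Vision Recognition flow: collect labeled
    example images and report a confidence percentage per label."""

    def __init__(self):
        self.examples = {}  # label -> list of example images

    def add_example(self, label, image):
        self.examples.setdefault(label, []).append(image)

    def confidence(self, label):
        # Toy heuristic: confidence grows with the number of examples,
        # capped at 95%. A real model would evaluate actual predictions.
        n = len(self.examples.get(label, []))
        return min(95, n * 15)

    def reset(self):
        # "Reset the training": clear the database and start over.
        self.examples.clear()


trainer = VisionTrainer()
for photo in ["cat_01.jpg", "cat_02.jpg", "cat_03.jpg"]:
    trainer.add_example("cat", photo)

if trainer.confidence("cat") < 80:
    # Not a satisfying result yet: add more examples (or reset).
    trainer.add_example("cat", "cat_04.jpg")
```

The point of the sketch is the workflow, not the model: the learner only ever adds examples and watches the confidence number respond.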

Now, let's suppose the robot is able to recognize the series of images you have shown it during training. If you want to ask it to do something more, switch to "Cause and Effect."

folio_web_flipaiot_02.png

Flowchart of "Vision Recognition" to "Cause and Effect"

User Interface Design

wireframe_v1-1.png

1. Wireframe

Before getting down to drawing wireframes, the team and I took Google's "Teachable Machine" as a reference. I made sure the whole process and interface were as simple as Teachable Machine's, so users could complete the training process within a few clicks, and more easily understand what "training" means.

Feature 2

Cause & Effect

"Cause and Effect" offers a graphical programming approach that helps you visualize every aspect of your idea for the robot, from reading measurement data to triggering the robot's actuators. This visualization makes it simple to represent complex logic in a diagram and develop applications for the robot.

For example, if you want your robot to turn right when sensing an object in front of it, you can add:

  • An ultrasound sensor: to measure the distance to the object in front

  • Logic and a constant: to define the distance range that triggers the action

  • An action: how you want the robot to move
     

Take a look at the image on the right. Now your robot can follow the instructions and do what you want it to do!
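The three blocks above, sensor, logic plus constant, and action, can be sketched as ordinary code. This is a hypothetical sketch of the same rule, not the app's actual implementation: the function names and the 20 cm threshold are made up for illustration.

```python
THRESHOLD_CM = 20  # the "constant" block: distance range that triggers the turn

def read_ultrasound():
    # Stand-in for the ultrasound sensor block; a real robot would
    # return the measured distance in centimeters.
    return 12

def decide_action(distance_cm):
    # The "logic" block: compare the sensor reading to the constant.
    if distance_cm < THRESHOLD_CM:
        return "turn_right"  # the "action" block when an object is close
    return "go_forward"      # default behavior otherwise

action = decide_action(read_ultrasound())
print(action)
```

In the app, learners wire up exactly this if/then structure by dragging blocks instead of typing, but the underlying cause-and-effect logic is the same.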

folio_web_flipaiot_03.png

Example of "Cause and Effect"

User Interface Design

Reflection

Pushing past the limits

This project took our team on a long and uncertain journey as we ventured into the development of something entirely unfamiliar. Personally, I was a complete novice when it came to AI, let alone designing an app and instructing it. With a mix of nerves and excitement, I embarked on the mission of making AI learning more accessible.

The most valuable aspect of this endeavor was the opportunity to build everything from the ground up. Working alongside my colleagues, including curriculum designers and software engineers, has been a fulfilling experience. We pushed beyond our defined roles, relentlessly seeking answers that seemed distant but remained eager to uncover them. While our progress may be just a small step in the expansive realm of AI learning, I believe I've gained more knowledge and insight from this project than I ever imagined.

UI Design of FlipAIoT
OWRC Regional Competition in ANZ
OWRC User Test 2
OWRC User Test 1