Projects
A collection of my recent projects
The main objective of this project was to design a viable commercial use case for a social robot in a public space, such as a shopping centre. We were asked to use a 1.7m tall humanoid robot called Chip (built by PAL Robotics) to create a supporting prototype and demonstrate our understanding of the unique challenges this emerging technology presents.
I designed a system that lets shopkeepers schedule the robot Chip to give away samples and survey customers via its touchscreen in a retail setting. The system has two components: a website, Chip for Hire, where shopkeepers book Chip as their product representative and manage their profiles, products, and survey content; and an application, Chip on Duty, which runs on Chip's platform and interacts with customers during promotion activities.
Shopkeepers can register on the Chip for Hire website to hire Chip as their sales representative. They can manage the shop profile (name, introduction, logo, and contact details) and create a product portfolio for promotion. Adding product photos is encouraged, as Chip can display them on its screen for better marketing results. To gather customer feedback, the website offers a survey builder where shopkeepers can design their own list of multiple-choice questions. Surveys are editable and sortable, so they can be reused across different marketing campaigns. Once the products and surveys are ready, shopkeepers can check Chip's availability on the homepage calendar and reserve a time slot (one hour per activity). The reservation form asks shopkeepers to pick products from their existing portfolio and select a customised survey. To motivate customers to give feedback, shopkeepers can offer a special offer as a reward for completing the survey. Once the booking is confirmed, the shopkeeper who hired Chip receives an activation code via email and SMS to trigger the promotion activity.
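The editable, sortable survey described above could be modelled roughly as follows. This is an illustrative sketch, not the actual implementation; all names are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """One multiple-choice question in a shopkeeper's survey."""
    text: str
    choices: list

@dataclass
class Survey:
    """An editable, sortable list of questions, reusable across campaigns."""
    title: str
    questions: list = field(default_factory=list)

    def add(self, question):
        self.questions.append(question)

    def move(self, old_index, new_index):
        """Reorder a question, e.g. when the shopkeeper drags it in the UI."""
        q = self.questions.pop(old_index)
        self.questions.insert(new_index, q)

survey = Survey(title="Sample tasting feedback")
survey.add(Question("How did the sample taste?", ["Great", "OK", "Not for me"]))
survey.add(Question("Would you buy this product?", ["Yes", "No", "Maybe"]))
survey.move(1, 0)  # sortable: the buying question now comes first
```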
Before the promotion activity, Chip automatically starts the Chip on Duty application and displays the activation page on its touchscreen, where the shopkeeper enters the code received earlier via SMS. The promotion activity begins once the web server verifies the activation code and Chip loads the activity content. During the activity, Chip can provide a live stream to the activity owner (the shopkeeper), subject to the shopping centre's agreement. Chip offers product samples to customers with a combination of speech and gestures and invites them to take the survey. Customers who give feedback can enter their mobile number on Chip's touchscreen to receive the special offer. Chip sends the mobile number and survey results to the Chip for Hire server, which in turn calls a cloud communication API to deliver the special offer to the customer via SMS. After the activity ends, the shopkeeper is notified via email and SMS, and the survey results are available on the website immediately.
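The server side of this flow (issue an activation code at booking time, verify it when Chip on Duty starts, record survey results, and text the reward) can be sketched as below. The class and method names are hypothetical, and the SMS gateway is a stub standing in for the real cloud communication API.

```python
import secrets

class SmsGateway:
    """Stand-in for the cloud communication API; records outgoing SMS."""
    def __init__(self):
        self.outbox = []
    def send(self, number, text):
        self.outbox.append((number, text))

class PromotionServer:
    """Sketch of the Chip for Hire server's role during an activity."""
    def __init__(self, sms):
        self.sms = sms
        self.activities = {}  # activation code -> activity details

    def book(self, shop, offer):
        """Confirm a booking and issue the activation code for the shopkeeper."""
        code = secrets.token_hex(3)
        self.activities[code] = {"shop": shop, "offer": offer, "results": []}
        return code

    def activate(self, code):
        """Chip on Duty submits the code typed on the touchscreen."""
        return self.activities.get(code)  # None means the code is invalid

    def submit_survey(self, code, mobile, answers):
        """Store the customer's answers and text them the special offer."""
        activity = self.activities[code]
        activity["results"].append(answers)
        self.sms.send(mobile, activity["offer"])

sms = SmsGateway()
server = PromotionServer(sms)
code = server.book("Fresh Juice Bar", "20% off any smoothie today")
assert server.activate(code) is not None      # code verified, content loaded
server.submit_survey(code, "+61 400 000 000", ["Great", "Yes"])
```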
The purpose of this project was to provide a user-friendly web interface for robot developers and researchers to access the Pepper robot: making the robot speak, driving it with a virtual joystick, monitoring its status, and writing test scripts in JavaScript. It used rosbridge_suite as the underlying communication interface to ROS.
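rosbridge_suite exposes ROS over a JSON protocol (typically via WebSockets), so the web interface ultimately exchanges messages like the ones built below. The `publish`, `subscribe`, and `call_service` ops come from the rosbridge v2 protocol; the topic names here are illustrative, not necessarily what this project used.

```python
import json

def publish(topic, msg):
    """rosbridge v2 'publish' op, e.g. driving the base from the virtual joystick."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

def subscribe(topic, msg_type):
    """rosbridge v2 'subscribe' op, e.g. monitoring the robot's status topics."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

def call_service(service, args):
    """rosbridge v2 'call_service' op for request/response interactions."""
    return json.dumps({"op": "call_service", "service": service, "args": args})

# Nudge the robot forward via a velocity topic (name assumed for illustration):
drive = publish("/cmd_vel", {"linear": {"x": 0.2, "y": 0.0, "z": 0.0},
                             "angular": {"x": 0.0, "y": 0.0, "z": 0.0}})

# Watch a battery status topic to show it in the monitoring panel:
battery = subscribe("/battery_state", "sensor_msgs/BatteryState")
```

In the real interface these JSON strings would be sent over the rosbridge WebSocket connection rather than just constructed.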
I also built an application called Pepper monitor that lets developers show customisable content on the touchscreen mounted on Pepper's chest. The application helps developers improve the human-robot interaction experience when building robot applications, since communication via the touchscreen is more efficient and reliable. This application helped our RoboCup team, UTSUnleashed, win the Best Human-Robot Interface award at the 2017 RoboCup@Home competition.
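One common way to drive Pepper's chest tablet is to point its webview at a small HTTP server and update the page it serves. The sketch below shows that pattern with Python's standard library; the template, content, and handler names are illustrative and not taken from Pepper monitor itself.

```python
import http.server
import threading
import urllib.request

PAGE_TEMPLATE = """<html><body style="font-size:3em;text-align:center">
<h1>{title}</h1><p>{body}</p></body></html>"""

# Mutable content the developer can change while the robot app runs.
content = {"title": "Hello!", "body": "Ask me anything."}

class MonitorHandler(http.server.BaseHTTPRequestHandler):
    """Serves the page the tablet webview displays; edit `content` to update it."""
    def do_GET(self):
        html = PAGE_TEMPLATE.format(**content).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(html)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), MonitorHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# In place of the tablet, fetch the page ourselves to show what it would render:
url = f"http://127.0.0.1:{server.server_port}/"
page = urllib.request.urlopen(url).read().decode()
server.shutdown()
```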
The aim of the RoboCup@Home competition is to develop service and assistive robot technologies with high relevance for future personal domestic applications. It is the largest international annual competition for autonomous service robots and is part of the RoboCup initiative.
Our team, UTSUnleashed, participated in the Social Standard Platform League (SSPL) from 2017 to 2019. We won the championship once (2019) and took second place twice (2017, 2018). The robot used in the SSPL is the Softbank/Aldebaran Pepper. There are several tests in the competition, and I was responsible for the General Purpose Service Robot test and the Restaurant test.
In the General Purpose Service Robot test, the robot is required to solve multiple tasks on request, none of which are predefined. The actions to be carried out are chosen randomly by the referees from a larger set, for example: "Pick up the banana from the cabinet, find Mary in the living room, and answer a question."
In the Restaurant test, the robot acts as a waiter in a real restaurant environment that has not been mapped before the test. The robot must perform online mapping, locate the barman (a referee), spot a customer who may wave or call out, understand the order, and recite it back to the barman.
My solution for these tests is based on a hierarchical state machine: the robot's behaviours are broken into modules in which all possible states and state transitions are described explicitly. To understand commands from a human operator, we used Google Speech-to-Text and parsed the transcribed sentence into a list of actions according to our grammar configuration.
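The two ideas above, a hierarchical state machine and grammar-driven command parsing, can be sketched together as follows. The grammar, verbs, and state names are simplified stand-ins for the real configuration, and the real system drove robot behaviours rather than appending to a log.

```python
class State:
    """A node in a hierarchical state machine: leaves do the work,
    parent states group their substates and run them in order."""
    def __init__(self, name, substates=None):
        self.name = name
        self.substates = substates or []

    def run(self, log):
        log.append(self.name)          # in the real robot: execute a behaviour
        for sub in self.substates:
            sub.run(log)

# Grammar: map verbs heard from the operator to behaviour modules (illustrative).
GRAMMAR = {
    "pick up": lambda obj: State("pick_up", [State(f"grasp:{obj}")]),
    "find":    lambda obj: State("find",    [State(f"search:{obj}")]),
    "answer":  lambda obj: State("answer",  [State("listen"), State("speak")]),
}

def parse_command(sentence):
    """Turn a transcribed sentence into a list of states per the grammar."""
    actions = []
    for clause in sentence.lower().split(","):
        for verb, build in GRAMMAR.items():
            if verb in clause:
                obj = clause.split(verb, 1)[1].strip().rstrip(".")
                actions.append(build(obj))
    return actions

# Build and "run" a plan for a GPSR-style command:
plan = State("gpsr", parse_command(
    "Pick up the banana, find Mary, answer a question."))
log = []
plan.run(log)
```

Running the plan visits `gpsr`, then each action state and its substates in order, which is exactly the explicit state/transition structure the modules rely on.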
This project helped a researcher in my lab conduct a social robotics experiment with the Pepper robot at the Sydney Startup Hub. The study explored how to design a registration and login process that responsibly collects the necessary private information with a social robot, while adhering to privacy regulations and providing an inclusive service.
Hub visitors were free to approach the robot, which was programmed to greet and engage with users detected at approximately 1.2 metres from the robot's camera. When a user was detected and greeted, they were invited to take part in the study. Users could complete the registration process on the robot or on the website, and could choose either a QR code or face ID to log in with the robot. The results of the experiment indicated that our social robot application successfully maintained contextual integrity for the majority of participants.