Based on our observations, we noted that participants in video presentations can't express the same non-verbal cues as in in-person presentations. For instance, raised hands often go unnoticed in video presentations. In general, participants struggle to convey information that would otherwise be expressed through body language or facial expressions. This limits the ways presenters and participants can engage with each other and especially dampens dialogue. Most video-conferencing software offers few features to encourage other forms of communication and participation during presentations, and the software that does have such features buries them behind menus and drop-downs.
Our goal is to create a system that helps participants and presenters interact more casually during video conferencing. By letting participants "emote" during presentations, then aggregating and displaying this information to everyone, we hope to increase participation. This feature will support non-disruptive forms of communication for participants and provide presenters with easy-to-interpret statistics about participant reactions.
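As a rough illustration of what we mean by aggregating reactions, here is a minimal sketch of the idea in TypeScript. Everything in it (the `EmoteEvent` shape, the emoji names, the `broadcast` callback) is a placeholder chosen for this sketch, not part of an existing implementation.

```typescript
// Minimal sketch of emote aggregation (all names here are hypothetical).
type Emoji = "love" | "clap" | "confused" | "raised-hand";

interface EmoteEvent {
  participantId: string;
  emoji: Emoji;
}

// Running, anonymous tally of reactions, keyed by emoji.
const counts = new Map<Emoji, number>();

function handleEmote(
  event: EmoteEvent,
  broadcast: (tally: Record<string, number>) => void
): void {
  counts.set(event.emoji, (counts.get(event.emoji) ?? 0) + 1);
  // Push the aggregated counts to every connected client so presenters
  // see easy-to-interpret statistics rather than individual reactions.
  broadcast(Object.fromEntries(counts));
}
```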
Here are the ideas generated in our first 10 plus 10 brainstorming session. At this stage, we did not constrain our designs by technological feasibility or layout; rather, we explored all possible design options.
After this session, we settled on a single-column interface that does not incorporate video-conferencing tools. We wanted our system to be usable by anyone, no matter their choice of video-conferencing platform. From there, we generated many ideas about the style and format of the interface, namely where key features like raising a hand or emoting would go. Here are the ideas we came up with:
We bounced our ideas off each other with sketches and conducted usability testing with our paper prototypes. For the usability testing, we ran Wizard of Oz tests on the tasks below.
Participant tasks:

| Task | Related usability goals |
|---|---|
| Let the presenter know you're following | 1,2,3,4 |
| Send a message in the chat | 1,2,4 |
| Add yourself to the question queue | 1,2,3,4 |
| Remove yourself from the question queue | 1,2,3 |
| Answer a polling question | 1,2,3,4 |
| View polling results on bar chart | 1,2,3 |
| Check which students are connected | 1,2 |
| Give the presenter a love-reaction | 1,2,3,4 |

Presenter tasks:

| Task | Related usability goals |
|---|---|
| Assess how students are following along | 1,2,3,4 |
| Check which reaction has the highest count | 1,2,3,4 |
| Look up who is next in the question queue and delete them from the queue | 1,2,3 |
| See multiple choice answers | 1,2,3 |
As planned, we conducted the Wizard of Oz testing with our paper prototypes and recorded all sessions. However, for privacy reasons we have removed those videos from this publicly available website.
Subject A is a student at HEC. She works at Desjardins and uses video conferencing daily for work and school. She feels comfortable using video-conferencing tools and participates in her online classes during lectures when asked to.
| Task | Completed? | Time taken | Observations |
|---|---|---|---|
| Let the presenter know you're following | N | 2 secs | Participant used an unintended feature. First, the participant tried to type a message in the chat box, then clicked one of the emojis. The participant did not use the "emotion bar" here. |
| Send a message on the chat | Y | 2 secs | . |
| Add yourself in the question queue | Y | 2 secs | . |
| Remove yourself from the question queue | Y | 2 secs | . |
| Answer the poll | Y | 2 secs | . |
| View the polling results | Y | 2 secs | The participant knew where to check this right away. However, when asked to check the bar-graph version of the results, she paused to think (~5 secs). She first tried to emote, unsuccessfully, then noticed the toggle button and clicked it. Once she figured out what the button did, it took her 2 secs to switch back and forth between views; she had learned the feature. |
| Check who is connected | N | 10 secs | User got confused and took a long time to complete the task. The user spent time looking for the feature: she incorrectly tried to emote first, then tried the chat, and seemed confused. She finally clicked the "People" button after ten seconds. |
| Emote love emoji | Y | 2 secs | . |
Are there any additional features you wish Relier had?
"A private chat to talk to my friends"
Were there any parts of the user interface you found confusing?
"At first, I was unsure where to check for the list of participants (looked at the emoji counts, but then found the button next to the chat)"
Any other feedback you'd like to provide us with?
"I think presenters that decide to use this app should let participants know at the beginning of the presentation because some of these features are already present in video-conferencing platforms and it could be confusing for them if people start reacting on both platforms."

"Also, I think the 'level of understanding' distribution should reset periodically in order to avoid confusion for the presenter if they clarify a concept and some participants forget to move their cursor back to the middle."
(Note that we already had this in mind, but since we were testing the design on paper this was not shown to the participants)
Subject B is a developer. He uses video-conferencing software four to six times a week for work, delivering presentations during code reviews, daily meetings, and bug-fixing sessions. He is very comfortable using video-conferencing tools. Participants in his presentations engage during the talk only when they have a question. They use webcam, voice chat, and text chat, but do not raise their hands or use emotes (e.g. "yes", "no", "clap" in Zoom).
| Task | Completed? | Time taken | Observations |
|---|---|---|---|
| Assess how students are following along by looking at the student understanding graph | N | 10 secs | Graph was initially hard to interpret. When the graph was flat, the subject was confused and didn't seem to know how to interpret it. He then clicked on an emoji to gain more context. Once the graph changed, however, he understood how to interpret it. |
| Check the emote with the highest count | Y | 5 secs | . |
| See who the next participant in the question queue is | Y | 2 secs | . |
| Remove yourself from the question queue | Y | 2 secs | . |
| Delete the person who just asked a question | N | Not completed | We forgot to include this button in our prototype |
| Ask a multiple choice question and check the answer with the highest count | Y | 2 secs | . |
| Check who is connected | N | 2 secs | Participant used an unintended feature. The user knew he had to look at the chat but did not click the "People" button. He suggested he would send a message in the chat and wait for responses in order to accomplish this task. |
Subject B stated that he would like to see the number of people expressing each "level of understanding". He likes the bar graph of emoji results with the numbers displayed. He also likes that the level-of-understanding distribution updates in real time, as he can always check whether participants are following his presentation. Overall, he appreciates that all results are anonymous, and he believes this software is especially useful for introverted participants.
A few improvements became clear to us upon analyzing the data:
Both the participant and the presenter initially had issues using and interpreting the level-of-understanding graph. This is understandable, given the lack of a title, subtitles, and labelling around the graph. To mitigate this confusion, we plan to include a title, x- and y-axis labels, and other explanatory text around the graph, as sketched below.
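If the final graph ends up being rendered with a charting library such as Chart.js (our library choice is still open, so this is only an assumption, and the category labels and counts below are made-up placeholders), adding that text is mostly configuration:

```typescript
import Chart from "chart.js/auto";

// Hypothetical labelled version of the level-of-understanding bar chart.
const canvas = document.getElementById("understanding-chart") as HTMLCanvasElement;

new Chart(canvas, {
  type: "bar",
  data: {
    labels: ["Lost", "Somewhat following", "Following well"], // placeholder categories
    datasets: [{ label: "Participants", data: [2, 5, 12] }],  // placeholder counts
  },
  options: {
    plugins: {
      title: { display: true, text: "How well are participants following?" },
    },
    scales: {
      x: { title: { display: true, text: "Level of understanding" } },
      y: { title: { display: true, text: "Number of participants" }, beginAtZero: true },
    },
  },
});
```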
Another source of confusion for both our participant and presenter was the difficulty of finding the toggle that displays which users are online. We believe we can fix this issue by making the button clearer, changing its label from "People" to "Who's online".
One of the issues we encountered was an oversight: in our prototypes, we forgot to include the "clear queue" button and the X beside the names of participants in the queue for removing them. We will add those buttons in our next prototype.
One idea we are considering is an overlay tutorial shown when the website is opened for the first time. This tutorial would be no more than three frames, each highlighting one feature at a time and providing a short description of what it does (e.g. highlight the emoji panel and include the text "Emote here and see others' emotes!"), along the lines of the sketch below.
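A rough sketch of how we might represent those frames (the selectors and wording are placeholders, not final UI names):

```typescript
// Hypothetical tutorial steps shown as a one-time overlay.
interface TutorialStep {
  targetSelector: string; // element to highlight
  text: string;           // short description shown next to it
}

const tutorialSteps: TutorialStep[] = [
  { targetSelector: "#emoji-panel", text: "Emote here and see others' emotes!" },
  { targetSelector: "#question-queue", text: "Add yourself to the question queue." },
  { targetSelector: "#chat", text: "Send a message to everyone." },
];
```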
As one of our users requested, we would also like to add a private chat feature where users can choose to send a message to all other users or to one in particular, as modelled in the sketch below. We liked this idea because we hope it will make shy and introverted users more comfortable participating, for instance by asking the presenter a question.
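One simple way to model this, assuming a JSON-style message format we have not yet settled on, is to make the recipient optional: when it is omitted, the message goes to everyone; when it is set, only that participant receives it.

```typescript
// Hypothetical chat message shape for public and private messages.
interface ChatMessage {
  from: string;    // sender's participant id
  to?: string;     // omit to send to everyone, set to send privately
  text: string;
  sentAt: number;  // Unix timestamp in milliseconds
}

// Sketch of how the server might decide who receives a message.
function recipientsOf(message: ChatMessage, allParticipants: string[]): string[] {
  return message.to === undefined
    ? allParticipants.filter((id) => id !== message.from) // public: everyone else
    : [message.to];                                        // private: one recipient
}
```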