This report was prepared for a class presentation in EDG6931 Human Computer Interaction and the Learner, Spring 2014 at the University of Florida.

Authors & Evaluators: Cheryl Calhoun and James Nichols

Technology evaluated: Piazza is an online study network, which works something like a hybrid between a wiki and a discussion board. Posts are created in the form of notes or questions. Students work together to edit a single student answer for each question, while instructors work together to create a single instructor answer. Instructors can also edit questions and student answers to ensure the integrity of the information presented matches the curriculum or course. In addition, Piazza provides the ability to host polls and post class resources, syllabi, and other helpful information.

Target user group(s): Instructors who might be interested in using Piazza in their classes and students who are currently using Piazza for the first time in their Spring 2014 classes.

Four instructors were selected from a diverse group of faculty representing both Santa Fe and UF. Two teach IT, one teaches Education, and the fourth teaches Ethics. Three of the four had used online discussion forums previously, but none had used Piazza prior to this evaluation.

The students are currently enrolled in CTS1131 at Santa Fe College. These students are using Piazza for the first time during the Spring 2014 semester. Twenty-six students participated in the evaluation survey.

Context: The instructor evaluations were conducted as one-on-one evaluations using Morae to record the sessions. After a brief introduction, the instructors were asked to create their instructor accounts and then perform a set of seven tasks in Piazza. Following completion of the tasks, they were asked open-ended questions about their experience and then administered a 10-question Likert-scale survey.

Students participated in an online usability study survey administered in their classes via the Canvas Learning Management System. The survey was given during the fifth week of the semester, after students had completed at least two assignments using Piazza. Some students completed the survey during optional on-campus class sessions; others completed the survey independently from home (or wherever they complete their course work).

This report is structured based on the DECIDE framework of usability testing (Rogers, Sharp, & Preece, 2001).

Determine the goals

Evaluation Goals

The primary goal of the project was to determine how well both instructors and students understand the function of Piazza and whether there are any issues with its usability. We also wanted to find out whether or not students and instructors saw any advantages to using Piazza as compared to the more traditional threaded discussion forums.

More specifically, we focused on the following objectives.

For Instructors:

  • Determining whether instructors feel competent in
    • Creating their instructor accounts.
    • Creating a new post or note and posting it in the related folders.
    • Editing a shared instructor or student post.
    • Finding posts through search or the organizational structures.
  • Do instructors find the course analysis data useful?
  • Do instructors like using Piazza more or less than other online discussion forums?
  • Would they use Piazza in their classes?
  • What recommendations do they have for improvement?

For Students:

  • Determine whether students feel competent in
    • Adding posts.
    • Reading and navigating existing posts.
    • Creating a shared student answer.
    • Finding posts based on keyword searches.
    • Setting user preferences, including notification preferences.
    • Reviewing class reports and statistics.
    • Finding endorsed questions.
    • Editing a single student answer wiki style.
    • Differentiating between content that should be added to the single student answer and content that should be added as a follow-up discussion.
  • Determine whether students readily acknowledge that users can edit other people’s posts.
  • Determine whether users would be more likely to edit other people’s posts if editing were anonymous.

About Piazza

Piazza is an online site where students can ask and answer questions 24 hours a day, 7 days a week. Piazza was founded by Pooja Sankar in 2009, while she was a student in Stanford’s business school, and launched in 2011. Currently over 1 million students use the site, and the number of students grew 233% between the 2011–2012 and 2012–2013 school years (Sankar, 2014). Piazza is aptly named: a piazza is a “public square or marketplace, especially in an Italian town” (Dictionary.com, LLC, 2014).

Piazza offers a wiki-style discussion forum that allows students to work together to collaboratively edit one single student answer, while instructors work together to edit one single instructor answer. Asking participants to collaborate in this way encourages a higher level of analysis and synthesis of concepts, in addition to building teamwork skills.

Piazza also provides scaffolding for questions and answers: because each question displays one concise answer, it is especially helpful for students preparing for certification exams.

User Goals

Instructors will use Piazza to support their students’ learning environment. They can create assignments and learning activities to encourage collaborative learning. To determine individual student performance, instructors will need to be able to determine how often students are posting.

Students will use Piazza to communicate with other students, clarify course requirements, and learn and synthesize course concepts. Because they can post anonymously, students may feel more comfortable asking questions they would not ask face to face or in an identified discussion forum.

To be successful they will need to be able to:

  • Post in the discussion
  • Find answers in the discussion

Explore the Questions

We developed two separate evaluation processes: one to evaluate Piazza from the instructor perspective and one from the student perspective.

Instructor Usability Instructions & Questions:

  1. Go to: piazza.com/sfcollege/other/ec101
    1. Join EC 101: Evaluation Class as an Instructor.
    2. Use access code: ec101
    3. You will be directed to your email account to finish the sign up process.
    4. Spend a few minutes reviewing and browsing through the interface.
    5. Post an introduction note with your name and some background information about your career experience or goals in educational technologies. Feel free to post anonymously if desired. Add your introduction to the introductions folder.
    6. Find and read Question #1 and contribute to the shared instructor answer.
      1. Can you tell how many times Question #1 has been edited and by whom?
      2. How would you endorse this post to indicate it is a good question?
      3. Find and read “Tips & Tricks for a successful class”.
      4. Find and display the student introductions folder.
      5. Find and review the course statistics.
      6. Find and review the course notification settings.

Post-evaluation questions:

  1. Do you like using Piazza? Why/why not?
  2. Do you like using Piazza more or less than other online discussion tools? Why/why not?
  3. What could improve Piazza? Why?
  4. Would you use this in your course? Why or why not?

Students Survey Questions:

In Piazza, how competent do you feel completing the following tasks?

  1. Add a post.
  2. Read and navigate existing posts.
  3. Create a student answer.
  4. Find posts based on keyword searches.
  5. Set user preferences, including notification preferences.
  6. Review class reports and statistics.
  7. Find endorsed questions.
  8. Edit a single student answer wiki style.
  9. Differentiate between content that should be added to the single student answer and content that should be added as a follow-up discussion.

Some other questions include:

  1. Would instructors want an ability to assess the quality of a student’s posts?
  2. What other ideas do students and instructors have for the system?

Choose Evaluation Methods

For instructors, we chose a do-it-yourself style of usability study as described in “Don’t Make Me Think” (Krug, 2014) and “Rocket Surgery Made Easy” (Krug, 2010). We recorded the usability study using TechSmith’s Morae, which let us record both a screen capture of the task completion and video of the instructor’s verbal comments and facial expressions during the evaluation. Instructors were given instructions for creating an instructor account and a list of seven tasks to complete. In addition, they were asked several open-ended questions, which were recorded during the Morae session. They were asked to talk out loud as they completed the tasks so we could know what they were thinking and experiencing.

This method allowed us to get more subjective feedback from instructors such as perceiving how this system might enhance their classes. We were careful not to script the process too much so as not to overly influence their perceptions. By asking open-ended questions, and encouraging dialog during the evaluation we were able to capture opinions we may have missed in a more structured evaluation.

For students, we used an anonymous survey designed to evaluate their current feelings of competence in using the tool. With students we had to be careful about the questions we asked, to ensure the responses were truly related to Piazza. Students are already resistant to participating in online discussions and group activities, and that resistance has the potential to bleed over into their evaluation of any online discussion or group-based tool.

Pre-interaction Data Collection Methods

For instructors, we kept the pre-interaction data collection minimal. We only asked two questions designed to determine their previous experience with online discussion forums and Piazza. We chose to keep our pre-data collection to a minimum in order not to overly influence their perceptions of Piazza before the evaluation.

Pre-Interaction Questions:

  1. Have you used online discussion forums or wikis as a teaching tool? If so, what did you use? What did you like or dislike about those tools?
  2. Have you used Piazza before? How many times have you used Piazza?

For students, we simply administered the survey in our classes. These students had already been exposed to Piazza as they are using it in their classes for the first time this semester. They had already completed an assignment where they posted an introduction and had been engaged in answering questions and other activities on Piazza.

Data Collection Methods

For the instructor usability study we used TechSmith’s Morae to record both a screen capture of task completion and a video of the instructors’ reactions and responses to Piazza. Cheryl and James participated as facilitators in this process, administering two usability studies each. During the study, the facilitators observed the instructors as they completed the tasks. Interaction was kept to a minimum and consisted mostly of encouraging instructors to verbalize their reactions. After all four usability studies were complete, we compiled the videos together for analysis using TechSmith Morae’s Manager program, which allows us to identify tasks and related data.

For the student survey, data was collected using Canvas’s Anonymous Survey tool. Data was then exported into Excel for analysis. Since the survey was posted in three sections, we used Excel to compile the data from all three classes into an overall summary report.
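The compilation step described above can be sketched as follows. The filenames and CSV layout here are assumptions for illustration only; the report does not specify the format of Canvas's survey export.

```python
import csv

# Hypothetical filenames: one Canvas survey export per CTS1131 section.
# The actual export names and columns are not stated in the report.
EXPORTS = ["cts1131_sec1.csv", "cts1131_sec2.csv", "cts1131_sec3.csv"]

def combine_sections(paths):
    """Concatenate the response rows from each section's CSV export
    into one list of dicts for an overall summary."""
    rows = []
    for path in paths:
        with open(path, newline="") as f:
            rows.extend(csv.DictReader(f))
    return rows
```

Once the rows from all three sections are combined, per-question summaries can be produced in Excel or with a few lines of aggregation code.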

Post-interaction Data Collection Methods:

No post-interaction data was collected for the student surveys. For the instructor usability studies we asked the following questions and recorded the instructors’ answers on the Morae video.

Instructor Post Evaluation Questions:

  1. Do you like using Piazza? Why/why not?
  2. Do you like using Piazza more or less than other online discussion tools? Why/why not?
  3. What could improve Piazza? Why?
  4. Would you use this in your course? Why or why not?

Types of Data Collected

For the instructor evaluation, we collected four recorded evaluation sessions using Morae. In addition, Morae collects responses to a short post-task evaluation form. See Appendix B: Instructor Morae Testing Script.

For the student survey, we collected 26 responses from three sections of CTS1131.

Identify the Practical Issues

Design Issues

For the one-on-one instructor evaluation, we installed a 30-day evaluation version of Morae on two laptops in order to capture the usability session. We then had to schedule time with each of the volunteer instructors to administer the usability study.

For the student survey, we created a Likert-scale survey administered in Canvas. We offered students time to complete the survey in class, if desired, and then left it posted in the online Canvas course to encourage online students to participate.

Testing Issues

We initially had some issues with getting the Piazza accounts to work. Our instructor account is tied to Santa Fe and the signup process initially needed users to have an .sfcollege.edu email account. Cheryl contacted Piazza and they were able to fix the account so that users without an .sfcollege.edu account were able to create accounts.

On one of the laptops selected to administer the testing, Morae would not recognize the built-in microphone. Fortunately, we had an additional USB microphone, which we were able to install and use successfully.

During the first round of one-on-one testing, we discovered that some of the language we had used in our questions did not match the language in Piazza, so our tasks were difficult to follow. We fixed the language for the other three evaluations.

One instructor had difficulty with the Piazza login process. This was an issue because they were completing the evaluation on a computer that had been previously used. When typing in the URL, instead of finishing the URL as listed in the instructions, they clicked the first one that came up on the drop down menu. This happened to be the previous instructor’s login confirmation link, which logged them in as the previous instructor. This is an isolated issue caused by a combination of inexperience, and the configuration of the testing environment.

Procedure for Data Collection

For the instructor usability study we asked four peer instructors if they would be willing to participate in a study of a new online discussion tool. We indicated the study would take about 30 minutes of their time and that we were completing the study for a class in our PhD program at the University of Florida. We did not have to provide any incentives for their participation. All four of the usability studies were conducted during the week of March 17 through March 21, 2014.

For the student survey, we asked students to volunteer to participate. Again, we informed them we were completing the study as part of a class in our PhD program at the University of Florida. We also told them we could use the information learned during the study to improve the CTS1131 course and other courses using Piazza. The survey was administered in the CTS1131 Canvas courses over a two week period during February 2014.

Decide How to Deal with the Ethical Issues

For the instructor evaluation, we used a recording consent form (Appendix A). In addition, we used Instructor 1–4 as identifiers in all analysis instead of actual participant names. All instructors are colleagues we know who were willing to provide some of their valuable time, so convenience was a factor in selection.

For the student evaluation, we conducted the online survey as an anonymous survey. Students were assigned an anonymous student number (e.g. Student 1, Student 2). In addition, students were allowed to complete the survey in an anonymous setting (home, office, etc.) if they wanted more privacy during their completion process.

Evaluate, Analyze, Interpret, and Present the Data

Data Analysis Methods

Do-it-yourself usability studies are qualitative methods designed to help designers improve what they are building by identifying and fixing usability problems (Krug, 2014). While we aren’t the designers of this product, this was a good method for us to use because it gives us a good understanding of where issues might exist in the product we evaluated (Piazza).

In delivering the usability study, we started with one instructor who is also an experienced researcher, which allowed us to pilot the process and elicit feedback on it. We made some changes to the questions and the follow-up survey after this first pilot test.

For the student survey, we summarized the student responses and then generated an overall average score for each question.

Data and Results

Instructor Usability Testing Results (3 instructors surveyed)

Instructors rated each statement on a 5-point scale from 1 (Strongly Disagree) to 5 (Strongly Agree). Response counts for each statement (n = 3) were:

  1. I think I would like to use Piazza in my class. (counts: 2, 1)
  2. I found Piazza unnecessarily complex. (counts: 1, 1, 1)
  3. I thought Piazza was easy to use. (counts: 1, 1, 1)
  4. I think that I would need the support of a technical person to be able to use Piazza. (counts: 1, 1, 1)
  5. I would imagine that most students would learn to use Piazza very quickly. (counts: 1, 2)
  6. I found Piazza very cumbersome to use. (counts: 2, 1)
  7. I felt very confident using Piazza. (counts: 2, 1)
  8. I would need to learn a lot of things before I could get going with Piazza. (counts: 1, 2)

Interestingly, the instructor who teaches outside of IT felt she would need the support of a technical person to be able to use Piazza but was neutral on Piazza being easy to use.

Time on Task Analysis

Tasks included in the time on task analysis include front page review, login, introduction post, editing question 1, finding Tips & Tricks, displaying introduction folder, course statistics, and course notification. Results are displayed in Figure 1: Time on Task Analysis.

The lengthiest task was editing the shared instructor answer on Question #1, partly because some instructors spent more time browsing the interface than others. This is also where they were stumped by needing to edit the single instructor answer. Instructor 2 spent 11.79 minutes on this task, much of it reviewing the history and trying to figure out how to display each instructor’s contribution.

Aside from the login process, the next lengthiest task was posting an instructor introduction. This again caused some difficulty: several instructors posted their introduction as a follow-up to an existing introduction, suggesting they did not understand how to create a new post for their introduction.

All instructors were able to find Tips & Tricks relatively easily, and three of the instructors found the course statistics relatively easily, but only one actually found the e-mail notification settings.

Figure 1: Time on Task Analysis

Student Usability Survey Results (26 students surveyed)

In Piazza, how competent do you feel completing the following tasks?

| Task | I have no idea | Not Competent | Somewhat Competent | Uncertain | Competent | Highly Competent | Total | Score |
| Add a post. | 12% | 12% | 4% | 8% | 62% | 4% | 26 | 3.1 |
| Create a student answer. | 8% | 19% | 8% | 4% | 46% | 15% | 26 | 3.1 |
| Set user preferences, including notification preferences. | 8% | 8% | 19% | 12% | 46% | 8% | 26 | 3.0 |
| Read and navigate existing posts. | 12% | 12% | 12% | 12% | 38% | 15% | 26 | 3.0 |
| Edit a single student answer wiki style. | 12% | 8% | 19% | 8% | 46% | 8% | 26 | 2.9 |
| Find posts based on keyword searches. | 12% | 15% | 19% | 4% | 38% | 12% | 26 | 2.8 |
| Review class reports and statistics. | 12% | 12% | 15% | 12% | 50% | 0% | 26 | 2.8 |
| Differentiate between content that should be added to the single student answer and content that should be added as a follow-up discussion. | 15% | 19% | 15% | 8% | 35% | 8% | 26 | 2.5 |
| Find endorsed questions. | 27% | 8% | 4% | 27% | 31% | 4% | 26 | 2.4 |
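The score for each task appears consistent with a weighted average of the response percentages, with the six response options weighted 0 (“I have no idea”) through 5 (“Highly Competent”); this weighting is our inference, as the report does not state it. A minimal sketch of that computation:

```python
# Assumed weights for the six response options, 0 ("I have no idea")
# through 5 ("Highly Competent") -- an inference, not stated in the report.
WEIGHTS = [0, 1, 2, 3, 4, 5]

def likert_score(percentages):
    """Weighted average of the six response-option percentages (summing to ~100)."""
    return sum(w * p / 100 for w, p in zip(WEIGHTS, percentages))

# Example rows from the survey table above.
print(round(likert_score([12, 12, 4, 8, 62, 4]), 1))   # "Add a post." -> 3.1
print(round(likert_score([27, 8, 4, 27, 31, 4]), 1))   # "Find endorsed questions." -> 2.4
```

With these assumed weights, the computed averages match the reported scores for every row of the table.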

Recommendations for Improvement

  • Make folders more visible on the page. One possible improvement: when hovering over a folder, show a darker border around it; once selected, fill it in with the dark color. This was a problem on the New Post page. One of the “Eight Golden Rules of Interface Design” is to offer informative feedback (Shneiderman & Plaisant, 2005), a rule clearly violated by providing the same feedback for both hovering and selection.
  • When signing up for a class, it is confusing to have both a Join Class button and a separate box below requesting the user’s email address. Clicking the “Join Class” button doesn’t seem to do anything.
  • One user searched for help about notification settings but didn’t find it useful.
  • The “Tips and Tricks” page forces the user to scroll up and down when reading the bottom tips. This contradicts Fitts’ Law (Fitts, 1954), which predicts that targets are reached faster when the distance to them is smaller.
  • Track additional data on student usage.
    • Allow click-through to individual student participation from the course statistics page.
    • If an instructor deletes a post, remove it from student activity counts; otherwise a student could receive credit for inappropriate posts an instructor deletes.
  • Include some sort of tutorial to help students understand the collaborative nature of wiki-style editing.

Reflections and Findings

For the most part, both students and instructors are able to find their way around Piazza fairly easily and utilize the basic functionality of the program. A few features may require a little more instruction, such as using the search feature, browsing through folders, and finding endorsed questions.

The hardest part of Piazza to understand is the collaborative wiki-style editing. This is true for both students and instructors. The tool has the potential to elevate the learning capabilities of a discussion forum; however, it will take both instructors and students learning how this new tool works. Students are highly reluctant to edit another person’s work, and instructors need to think strategically about how to develop activities that elicit deeper analysis and thought.

The instructor not involved in IT asked, “What is the difference between Piazza and Google Docs?”

Several instructors cited the ability to collaborate as a benefit. According to Goodyear, Jones, and Thompson (2014), this kind of online collaboration with little face-to-face communication is a growing form of “networked learning,” a category that would seem to include Piazza. One instructor further stated it is nice that the instructor can refine responses to questions, as this would save time. Another thought shy students would probably use it more.

One instructor mentioned “students like to know other students are struggling with the same material”, which helps to accomplish one of Piazza’s goals of collaboration and cooperation.

One instructor said she likes spending less time typing “Good question” to give encouragement, since she can simply click “Good question” or “Good answer.” She also noted there is no delay from having to wait until class to tell students, “This question came up multiple times in email.”

A shortcoming of our Morae setup is that it did not record participants while they filled out the post-task survey; recording that portion would have captured their comments, as well as any delay in answering specific questions, and allowed for better follow-up.

References

Dictionary.com, LLC. (2014). Piazza. Retrieved from Dictionary: http://dictionary.reference.com/browse/piazza

Fitts, P. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology, 47, 381-391.

Goodyear, P., Jones, C., & Thompson, K. (2014). Computer-Supported Collaborative Learning: Instructional Approaches, Group Processes and Educational Designs. Handbook of Research on Educational Communications and Technology, 439-451.

Krug, S. (2010). Rocket Surgery Made Easy. Berkley: New Riders.

Krug, S. (2014). Don’t Make Me Think, Revisited: A Common Sense Approach to Web and Mobile Usability. Berkley, CA, USA: New Riders.

Rogers, Y., Sharp, H., & Preece, J. (2001). Interaction Design. West Sussex: John Wiley & Sons.

Sankar, P. (2014, February 26). Site wants to be LinkedIn for top ten tech talent. (L. Claman, Interviewer)

Shneiderman, B., & Plaisant, C. (2005). Designing the User Interface: Strategies for Effective Human-Computer Interaction (4th ed.). Boston: Pearson Education, Inc.

Appendix A: Recording Consent Form

Appendix B: Instructor Morae Testing Script

Piazza Usability Testing (with Morae)

Setup: add the participant’s e-mail to the Piazza instructor access list.

Welcome & Questions (6 minutes):

Before beginning, review the process with the participant. Assure them that the purpose of the evaluation is to test the system and that there will be no evaluation of their performance. Ask them to talk out loud as much as possible during the review so we can capture their thoughts and impressions.

  1. Have you used online discussion forums or wikis as a teaching tool? If so, what did you use? What did you like or dislike about those tools?
  2. Have you used Piazza before? How many times have you used Piazza?

Home Page Tour (3 minutes):

  1. Go to: piazza.com. Review page content including the “1-minute Founder Video”.

Begin the Tasks (15 minutes):

  1. Go to: piazza.com/sfcollege/other/ec101
    1. Join EC 101: Evaluation Class as an Instructor.
    2. Use access code: ec101
    3. You will be directed to your email account to finish the sign up process.
    4. Spend a few minutes reviewing and browsing through the interface.
    5. Post an introduction note with your name and some background information about your career experience or goals in educational technologies. Feel free to post anonymously if desired. Add your introduction to the introductions folder.
    6. Find and read Question #1 and contribute to the shared instructor answer.
      1. Can you tell how many times Question #1 has been edited and by whom?
      2. How would you endorse this post to indicate it is a good question?
      3. Find and read “Tips & Tricks for a successful class”.
      4. Find and display the student introductions folder.
      5. Find and review the course statistics.
      6. Find and review course notification settings.

Wrap Up (5 minutes)

  1. Do you like using Piazza? Why/Why not?

Do you like using Piazza more or less than other online discussions means? Why