Research Methods


Prof. Debaleena Chattopadhyay
Email: debchatt
Personal Webpage:
Lab Webpage:

Course Information

Credit Hours 4
Class Time Monday & Wednesday, 4:30pm — 5:45pm
Class Location ERF 2068 [map]
Instructor's Office 1135 SEO    [map]
Instructor's Office Hours Monday & Tuesday, 11am — 12pm
Link to Piazza

Course Description

In this course, students will become broadly familiar with state-of-the-art empirical methods in human-centered computing (HCC).


This course will be accessible to students with a wide range of backgrounds across computer science. Students are required to have a core research area (e.g., security, data science, NLP, or visualization) in which they have taken and passed core courses. Students who are currently research-active are especially encouraged to enroll. A mere interest in an area, without familiarity with its core concepts and research questions, is not sufficient. CS + X students are welcome to take this course; the core-research-area requirement still applies, and that area should be in CS. Examples of X include communication, health, learning sciences, or urban planning. Some familiarity with statistics will be assumed.

General Guidelines to Use the Syllabus

Students are responsible for familiarizing themselves with the syllabus. The instructor is responsible for being responsive to the diverse needs of the enrolled students and making necessary modifications to this syllabus, which is to be treated as a living document.

Blackboard will be used to post course grades. Submit individual assignments via Blackboard.

Piazza will be used to post announcements, FAQs, discussion points, course-related general queries, and detailed instructions when deemed necessary.

The course webpage will contain links to readings, general instructions, and class schedule.

Email policy: Post general questions publicly on Piazza, and post private matters anonymously (anonymous to classmates, not to the instructor). The instructor will respond on Piazza. Except in extenuating circumstances and emergencies, students should strictly avoid using email for class-related questions, inquiries, or information.

Students are responsible for keeping themselves updated on class announcements.

Learning Objectives

Course Overview: If a research project involves building human-centered computing (HCC) applications or understanding human-computer interactions, then human-subject studies are essential to assessing project success. More and more, we are building computing tools that directly interface with humans: tools to improve student learning, benefit group work, create a healthy habit, provide better recommendations, crowdsource our next trip, protect our sensitive data from malicious websites, or gather critical insights from a big data visualization. How do we know that these computing systems are effective? What evidence do we have that the research project was successful? How do we conduct a meaningful assessment? What are the appropriate metrics? How can we improve the system? To answer all these questions, an in-depth understanding of the state-of-the-art empirical methods in HCC is essential.

Students will learn how to use human-subject data to (1) systematically evaluate human-computer interactions and (2) provide insights that inform the design of human-centered computing systems. This course will cover a range of empirical methods, such as experiment design, hypothesis testing, log analysis, grounded theory, and crowdsourcing, which will train graduate students to critically examine the implications of human-computer interactions in different CS areas such as security, machine learning, data science, or visualization. Both qualitative and quantitative methods will be covered. Emphasis will be placed on understanding the state of the art of these methods and on developing intuition about which methods are appropriate in various application contexts.

Upon successful completion of this course, students will be able to (somewhat independently) design and (reasonably) conduct human subject experiments in any area of computing.

Course Texts


Ways of Knowing in HCI
Judith S. Olson and Wendy A. Kellogg
Springer 2014

Available online from the UIC library


Discovering Statistics Using R
Field, Miles, & Field
SAGE 2012
Human-Computer Interaction: An Empirical Research Perspective
I. Scott MacKenzie
Morgan Kaufmann 2013
The Basics of Social Research
Earl R. Babbie
Cengage Learning 2011
Contextual Design: Design for Life (Interactive Technologies)
Karen Holtzblatt
Morgan Kaufmann 2016
Basics of Qualitative Research
Juliet Corbin & Anselm L. Strauss
SAGE 2015
Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan
John K. Kruschke
Academic Press / Elsevier 2015

Class Details

Class meetings will include a mix of lectures, student presentations, paper discussions, and project discussions.

Sign up for paper presentations

Research topics for additional readings:

  • Collaborative data visualization with VR and AR
  • Visual analytics
  • Learning analytics
  • Health
  • Distributed team collaboration
  • Social network analysis
  • Collocated collaboration
  • Virtual digital characters
  • Gestural interaction
  • Wearables for environmental sensing
  • Participatory design with older adults


Week Topic Readings Deliverables
Week 1
Aug 27
Introduction to Empirical Methods
General Introductions
Aug 29 Structuring the Research Inquiry



  • The Basics of Social Research, Earl R. Babbie. Ch 4: Research Design
  • The Basics of Social Research, Earl R. Babbie. Ch 5: Conceptualization, Operationalization, and Measurement
(link to Babbie readings via UIC login)

Aug 31: research intro due
Week 2
Sep 3
Labor Day, no class
Sep 5 Grounded Theory Method


  • Textbook chapter — Curiosity, Creativity, and Surprise as Analytic Tools: Grounded Theory Method
  • Baumer, E. P., Mimno, D., Guha, S., Quan, E., & Gay, G. K. (2017). Comparing grounded theory and topic modeling: Extreme divergence or unlikely convergence? Journal of the Association for Information Science and Technology, 68(6), 1397-1410.


  • Conlen, M., Stalla, S., Jin, C., Hendrie, M., Mushkin, H., Lombeyda, S., & Davidoff, S. (2018). Towards Design Principles for Visual Analytics in Operations Contexts. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 138). ACM.
  • Lee, S., Kim, S. H., Hung, Y. H., Lam, H., Kang, Y. A., & Yi, J. S. (2016). How do people make sense of unfamiliar visualizations? A grounded model of novice's information visualization sensemaking. IEEE transactions on visualization and computer graphics, 22(1), 499-508.
  • Yardi, S., & Bruckman, A. (2011). Social and technical challenges in parenting teens' social media use. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 3237-3246). ACM.
  • Myneni, S., Cobb, N. K., & Cohen, T. (2013). Finding meaning in social media: content-based social network analysis of QuitNet to identify new opportunities for health promotion. In MedInfo (pp. 807-811).
  • Chini, J. J., Straub, C. L., & Thomas, K. H. (2016). Learning from avatars: Learning assistants practice physics pedagogy in a classroom simulator. Physical Review Physics Education Research, 12(1), 010117

IRB training due
Week 3
Sep 10
student presentations
  • Conlen, M., Stalla, S., Jin, C., Hendrie, M., Mushkin, H., Lombeyda, S., & Davidoff, S. (2018, April). Towards Design Principles for Visual Analytics in Operations Contexts. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 138). ACM.
    [slides] (presented by Harish)
Sep 12 Experimental research in HCI (parametric)


  • Textbook chapter — Experimental Research in HCI
  • Discovering Statistics Using R, Field, Miles, & Field. Ch 2.
(link to Field readings via UIC login)


  • (If new to R) Discovering Statistics Using R, Field, Miles, & Field. Ch 3 & Ch 4.
  • Zheng, J., Bi, X., Li, K., Li, Y., & Zhai, S. (2018). M3 Gesture Menu: Design and Experimental Analyses of Marking Menus for Touchscreen Mobile Interaction. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 249). ACM.
  • Mäkelä, V., Khamis, M., Mecke, L., James, J., Turunen, M., & Alt, F. (2018). Pocket Transfers: Interaction Techniques for Transferring Content from Situated Displays to Mobile Devices. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 135). ACM.

Week 4
Sep 17
student presentations
  • Bickmore, T. W., Silliman, R. A., Nelson, K., Cheng, D. M., Winter, M., Henault, L., & Paasche‐Orlow, M. K. (2013). A randomized controlled trial of an automated exercise coach for older adults. Journal of the American Geriatrics Society, 61(10), 1676-1683.
    [slides] (presented by Tengteng)
  • Mäkelä, V., Khamis, M., Mecke, L., James, J., Turunen, M., & Alt, F. (2018, April). Pocket Transfers: Interaction Techniques for Transferring Content from Situated Displays to Mobile Devices. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 135). ACM.
    [slides] (presented by Pantea)
Sep 19 Experimental research in HCI (non-parametric)


  • Discovering Statistics Using R, Field, Miles, & Field. Ch 14: Non-Parametric tests.
(link to Field readings via UIC login)


  • Piumsomboon, T., Lee, G. A., Hart, J. D., Ens, B., Lindeman, R. W., Thomas, B. H., & Billinghurst, M. (2018). Mini-Me: An Adaptive Avatar for Mixed Reality Remote Collaboration. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 46). ACM.
  • Gugenheimer, J., Stemasov, E., Sareen, H., & Rukzio, E. (2018). FaceDisplay: Towards Asymmetric Multi-User Interaction for Nomadic Virtual Reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 54). ACM.

Week 5
Sep 24
student presentations
Sep 26 Social Network Analysis
Guest lecture: Prof. Tanya Berger-Wolf
(Debaleena at Grace Hopper)

[slides from Prof. Tanya Berger-Wolf]
[more slides]


  • Textbook chapter — Social Network Analysis in HCI
  • Fisher, D., & Dourish, P. (2004). Social and temporal structures in everyday collaboration. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 551-558). ACM.
  • Choudhury, T., & Pentland, A. Sensing and Modeling Human Networks using the Sociometer. (2003). In Proceedings of the Seventh IEEE International Symposium on Wearable Computers (ISWC’03) (Vol. 1530, No. 0811/03, pp. 17-00). link to paper


  • de-Marcos, L., García-López, E., García-Cabot, A., Medina-Merodio, J. A., Domínguez, A., Martínez-Herráiz, J. J., & Diez-Folledo, T. (2016). Social network analysis of a gamified e-learning course: Small-world phenomenon and network metrics as predictors of academic performance. Computers in Human Behavior, 60, 312-321.
  • Pirker, J., Lesjak, I., Punz, A., & Drachen, A. (2018). Social Aspects of the Game Development Process in the Global Game Jam. In Proceedings of the International Conference on Game Jams, Hackathons, and Game Creation Events (pp. 9-16). ACM.
  • Dong, W., Ehrlich, K., Macy, M. M., & Muller, M. (2016). Embracing cultural diversity: Online social ties in distributed workgroups. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 274-287). ACM.

research proposal due
Week 6
Oct 1
student presentations
  • Dong, W., Ehrlich, K., Macy, M. M., & Muller, M. (2016, February). Embracing cultural diversity: Online social ties in distributed workgroups. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 274-287). ACM.
    [slides] (presented by Andrew)
  • Wu, L., & Liu, H. (2018, February). Tracing fake-news footprints: Characterizing social media messages by how they propagate. In Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining (pp. 637-645). ACM.
    [slides] (presented by Sourabh)
Oct 3 Log data and analysis


  • Textbook chapter — Understanding User Behavior Through Log Data and Analysis


  • Boyd, L. E., Gupta, S., Vikmani, S. B., Gutierrez, C. M., Yang, J., Linstead, E., & Hayes, G. R. (2018). vrSocial: Toward Immersive Therapeutic VR Systems for Children with Autism. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 204). ACM.
  • Homaeian, L., Goyal, N., Wallace, J. R., & Scott, S. D. (2018). Group vs Individual: Impact of TOUCH and TILT Cross-Device Interactions on Mixed-Focus Collaboration. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 73). ACM.
  • Yoon, D., Chen, N., Randles, B., Cheatle, A., Löckenhoff, C. E., Jackson, S. J., ... & Guimbretière, F. (2016). RichReview++: Deployment of a Collaborative Multi-modal Annotation System for Instructor Feedback and Peer Discussion. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (pp. 195-205). ACM.

Week 7
Oct 8
student presentations
Guest moderator: Prof. Chris Kanich
(Debaleena at UbiComp)
  • Bauer, A., Flatten, J., & Popovic, Z. (2017). Analysis of problem-solving behavior in open-ended scientific-discovery game challenges. In Proceedings of the 10th International Conference on Educational Data Mining.
    [slides] (presented by Aditi)
  • Dietsch, D., Podelski, A., Nam, J., Papadopoulos, P. M., & Schäf, M. (2013). Monitoring student activity in collaborative software development. arXiv preprint arXiv:1305.0787.
    [slides] (presented by Jesse)
Oct 10 Debaleena at UbiComp, no class. Annotated bibliography due
Week 8
Oct 15
Mid-term project presentations & peer review
Oct 17 Mid-term project presentations & peer review
Week 9
Oct 22
Project discussions
Oct 24 Ethnography & Contextual Inquiry


  • Textbook chapter — Field Deployments: Knowing from Using in Context
  • Textbook chapter — Reading and Interpreting Ethnography


  • Chattopadhyay, D., Ghahari, R. R., Duke, J., & Bolchini, D. (2016). Understanding advice sharing among physicians: towards trust-based clinical alerts. Interacting with Computers, 28(4), 532-551.
  • Dourish, P. (2006, April). Implications for design. In Proceedings of the SIGCHI conference on Human Factors in computing systems (pp. 541-550). ACM.
  • Chilana, P. K., Wobbrock, J. O., & Ko, A. J. (2010, April). Understanding usability practices in complex domains. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2337-2346). ACM.
  • Dillahunt, T. R., Bose, N., Diwan, S., & Chen-Phang, A. (2016, June). Designing for disadvantaged job seekers: Insights from early investigations. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (pp. 905-910). ACM.

Peer reviews due
Week 10
Oct 29
student presentations
  • Dourish, P. (2006, April). Implications for design. In Proceedings of the SIGCHI conference on Human Factors in computing systems (pp. 541-550). ACM.
    [slides] (presented by Harish)
  • Eslambolchilar, P., Bødker, M., & Chamberlain, A. (2016). Ways of walking: understanding walking's implications for the design of handheld technology via a humanistic ethnographic approach. Human Technology, 12.
    [slides] (presented by Nina)
Oct 31 Survey research in HCI


  • Textbook chapter — Survey Research in HCI
  • Discovering Statistics Using R, Field, Miles, & Field. Ch 17.
  • Hox, J. J., & Bechger, T. M. (1998). An introduction to structural equation modeling [link]


  • Structural Equation Modeling With the sem Package in R. [link]
  • Rosseel, Y. (2012). Lavaan: An R package for structural equation modeling and more. Version 0.5–12 (BETA). Journal of statistical software, 48(2), 1-36. [link]
  • Chattopadhyay, D., & MacDorman, K. F. (2016). Familiar faces rendered strange: Why inconsistent realism drives characters into the uncanny valley. Journal of vision, 16(11), 7-7.
  • O'Brien, H. L., & Toms, E. G. (2010). The development and evaluation of a survey to measure user engagement. Journal of the American Society for Information Science and Technology, 61(1), 50-69.
  • Fensli, R., Pedersen, P. E., Gundersen, T., & Hejlesen, O. (2008). Sensor acceptance model–measuring patient acceptance of wearable sensors. Methods of information in medicine, 47(01), 89-95.
  • Shin, D. H. (2013). Defining sociability and social presence in Social TV. Computers in human behavior, 29(3), 939-947.
  • Webster, J., Trevino, L. K., & Ryan, L. (1993). The dimensionality and correlates of flow in human-computer interactions. Computers in human behavior, 9(4), 411-426.

Week 11
Nov 5
student presentations
  • Lindgren, R., Tscholl, M., Wang, S., & Johnson, E. (2016). Enhancing learning and engagement through embodied interaction within a mixed reality simulation. Computers & Education, 95, 174-187.
    [slides] (presented by Aditi)
  • O'Brien, H. L., & Toms, E. G. (2010). The development and evaluation of a survey to measure user engagement. Journal of the American Society for Information Science and Technology, 61(1), 50-69.
    [slides] (presented by Jesse)
Nov 7 Introduction to Meta-Analysis [slides]


  • Introduction to Meta-Analysis. Borenstein, Michael. Parts 1-4, pp. 3-85. [ebook available at UIC library]


  • Meta-Analysis with R. Schwarzer, Carpenter, and Rücker. Ch 1 and Ch 2. [ebook available at UIC library]
  • Abedtash, H., & Holden, R. J. (2017). Systematic review of the effectiveness of health-related behavioral interventions using portable activity sensing devices (PASDs). Journal of the American Medical Informatics Association, 24(5), 1002-1013.
  • Tsoi, K. K., Chan, J. Y., Hirai, H. W., Wong, S. Y., & Kwok, T. C. (2015). Cognitive tests to detect dementia: a systematic review and meta-analysis. JAMA internal medicine, 175(9), 1450-1458.
  • Loo Gee, B., Griffiths, K. M., & Gulliver, A. (2015). Effectiveness of mobile technologies delivering Ecological Momentary Interventions for stress and anxiety: a systematic review. Journal of the American Medical Informatics Association, 23(1), 221-229.
  • Merchant, Z., Goetz, E. T., Cifuentes, L., Keeney-Kennicutt, W., & Davis, T. J. (2014). Effectiveness of virtual reality-based instruction on students' learning outcomes in K-12 and higher education: A meta-analysis. Computers & Education, 70, 29-40.

Week 12
Nov 12
student presentations
Nov 14 Crowdsourcing in HCI research


  • Textbook chapter — Crowdsourcing in HCI Research


  • Guo, A., Jain, A., Ghose, S., Laput, G., Harrison, C., & Bigham, J. P. (2018). Crowd-AI camera sensing in the real world. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(3), 111.
  • Andolina, S., Schneider, H., Chan, J., Klouche, K., Jacucci, G., & Dow, S. (2017, June). Crowdboard: Augmenting In-Person Idea Generation with Real-Time Crowds. In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition (pp. 106-118). ACM.
  • Bernstein, M. S., Little G., Miller R. C., Hartmann B., Ackerman M. S., Karger D. R., et al. (2010). Soylent: A word processor with a crowd inside. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (pp. 313-322). New York, NY: ACM
  • Heer, J., & Bostock, M. (2010). Crowdsourcing graphical perception: Using mechanical turk to assess visualization design. In Proceedings of the 28th International Conference on Human Factors in Computing Systems (pp. 203–212). New York, NY: ACM

Week 13
Nov 19
student presentations
Nov 21 Sensor data streams and Eye tracking in HCI


  • Textbook chapter — Sensor Data Streams
  • Textbook chapter — Eye Tracking: A Brief Introduction


  • Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human–computer interaction. In Advances in physiological computing (pp. 39-65). Springer, London.
  • Duggan, G. B., & Payne, S. J. (2011, May). Skim reading by satisficing: evidence from eye tracking. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1141-1150). ACM.
  • Froehlich, J., Larson, E., Saba, E., Campbell, T., Atlas, L., Fogarty, J., et al. (2011). A longitudinal study of pressure sensing to infer real-world water usage events in the home. In Proceedings of the 9th international conference on pervasive computing (PERVASIVE 2011) (pp. 50–69). Berlin: Springer
  • Tang, J., & Patterson, D. (2010). Twitter, sensors and UI: Robust context modeling for interruption management. In Proceedings of the 18th international conference on user modeling, adaptation, and personalization (UMAP 2010) (pp. 123-134). Berlin: Springer.

Week 14
Nov 26
student presentations
Nov 28 Introduction to Bayesian Analysis


  • Introduction to Bayesian Statistics. pp. 1 – 60 [link]
  • Bayesian Statistics explained to Beginners. [link]


  • Kaptein, M., & Robertson, J. (2012, May). Rethinking statistical analysis methods for CHI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1105-1114). ACM.
  • Kay, M., Nelson, G. L., & Hekler, E. B. (2016, May). Researcher-centered design of statistics: Why Bayesian statistics better fit the culture and incentives of HCI. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 4521-4532). ACM.

Week 15
Dec 3
student presentations
Dec 5 Project discussions
Dec 11
Final project presentations. Final project report due (11:59 PM CST, Dec 11)
Lecture   |   Discussions


Before attempting any assignment, students are required to review and understand this course's collaboration policy.

Course Collaboration Policy

Discussion points (10%)

To receive a grade for discussion points, students are required to post discussion points on assigned research papers by the end of the day before the class AND be present in class to discuss them.

Annotated bibliography (10%)

At least 25 carefully chosen sources are required. Students should choose a topic related to their research interests. Writing an annotated bibliography is not akin to copying and pasting paper abstracts; critical thinking and mature scientific reflection are required. Use APA Style.

Helpful tips can be found here:

Paper presentations (30%)

Each student will present two to three research papers in this class. (The actual number of presentations will depend on the class's final enrollment.) The purpose of these presentations is twofold: first, to summarize and provide an overview of the paper, and second, to lead a class discussion focusing on the research method(s) used. The papers will pertain to the student's research interests and will be decided jointly by the instructor and the student. Papers scheduled for presentation will serve as assigned readings for that class meeting.

Course project (40%)

An integral component of this course is the semester-long project. Students should choose a research topic of their interest and a set of methods to use for this class. Both individual and group projects are allowed (depending on the class's final enrollment).

Part 1 (15%) Research proposal. Research design and conceptualization of a chosen research topic.

Part 2 (25%) Data analysis. Results and discussion.

Class participation + IRB certification (10%)

This is a graduate-level, seminar-style, special-topics course. Students are expected and required to actively participate in class discussions.

Students will be asked to critique the mid-term and final project presentations of their peers, which will contribute to their class participation grade.

CITI course link. Complete the Group 2 course, Human Subjects Research (HSR): Social-Behavioral-Educational Research Investigators and Key Personnel, and submit the completed training certificate to receive a grade.

CITI completion certificate example

When in doubt, always ask the instructor or TA.

Assessment & Evaluation

Discussion points 10%
Annotated bibliography 10%
Paper presentations (2-3) 30%
Course project 40%
Class participation + IRB certification 10%
Total 100%
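As an illustration only (not an official grading tool), the component weights above combine into a final percentage as a simple weighted average. The function and example scores below are hypothetical:

```python
# Hypothetical sketch of how the course weights combine into a final
# percentage. Component names and example scores are made up for
# illustration; only the weights come from the table above.
WEIGHTS = {
    "discussion_points": 0.10,
    "annotated_bibliography": 0.10,
    "paper_presentations": 0.30,
    "course_project": 0.40,
    "participation_irb": 0.10,
}

def final_percentage(scores):
    """Weighted average of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Example with made-up scores:
scores = {
    "discussion_points": 90,
    "annotated_bibliography": 85,
    "paper_presentations": 88,
    "course_project": 92,
    "participation_irb": 100,
}
print(round(final_percentage(scores), 1))  # → 90.7
```

Per the policy below, a final average of 90 or above is guaranteed to be an A, so this hypothetical student would earn an A.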



Letter grades are determined at the end of the semester using the default cutoffs above. These boundaries may be adjusted downward if necessary because of the difficulty of the assignments, but they will never be adjusted upward, so a final average of 90 is guaranteed to be an A. Boundary adjustment is done heuristically; there are no grade quotas, no grade targets, and no centering of the class on a particular grade boundary.

Attendance, Assignment Deadlines, and Late Policy:

Attendance. Class attendance is not mandatory but is strongly recommended; attending at least 60% of the class meetings is required. Students not meeting this requirement may be penalized up to 10% of their total grade.

Students absent during a group presentation will receive zero for that assignment.

Assignment deadlines will be posted (and may be updated) on the course page.

Lateness and Extensions

To give you some flexibility for periods of heavy workload, minor illness, absence from campus, job interviews, and other occasional (but often predictable) circumstances, you may use limited extensions on course deliverables, called slack days. Each slack day is a 24-hour extension on the deadline. You have a budget of 5 slack days for the entire semester, which you may apply to any combination of individual assignments. You can use at most 2 slack days for a given assignment. Assignments more than two days late will not be accepted.

To use a slack day, just submit the assignment late. You DO NOT need to notify the instructor. When we grade your assignment, we will see that you submitted late and dock the appropriate number of slack days in our records. You are responsible for keeping track of the slack days you've used. If you have used up your slack days, or exceeded the 2-day limit for a single assignment, you will need the instructor's permission for a further extension.

We expect that almost all your needs for extensions will be handled by slack days. Only truly exceptional, extreme emergency cases will be considered for extensions after your slack days are exhausted. In particular, to receive such an extension you will need to convince the staff that all five of your slack days were used for reasons that would have justified an extension, so use your slack budget wisely. Finally, we want to emphasize that slack days are for emergencies. Plan to submit every assignment on the real due date, and only use a slack day or two if something unexpected comes up at the last minute. Do not treat slack days as pre-planned due-date extensions. In particular, if you are already using two slack days for an assignment and email us at the end of your second day requesting an extension for a third day, we will probably turn you down.
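The slack-day rules above (a 5-day semester budget, at most 2 slack days per assignment, nothing accepted more than two days late) can be summarized in a short sketch. This is purely illustrative bookkeeping, not a tool the course staff uses; the function name and return strings are invented:

```python
# Hypothetical bookkeeping for the slack-day policy described above.
SEMESTER_BUDGET = 5      # total slack days per semester
PER_ASSIGNMENT_CAP = 2   # max slack days on any single assignment

def slack_status(days_late, slack_used_so_far):
    """Classify a submission under the stated late policy."""
    if days_late == 0:
        return "on time"
    if days_late > PER_ASSIGNMENT_CAP:
        return "not accepted (more than two days late)"
    if slack_used_so_far + days_late > SEMESTER_BUDGET:
        return "needs instructor permission (budget exhausted)"
    return f"accepted, {days_late} slack day(s) docked"
```

For example, a submission three days late is never accepted automatically, and a submission two days late by a student who has already used four slack days exceeds the semester budget and requires the instructor's permission.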

Supplementary Materials

This course aims to help students achieve an introductory level of mastery in user interface design and development skills. The course requires a variety of skills, including designing user interfaces, conceptualizing ideas, building computing prototypes, conducting user evaluations, and collecting, analyzing, and interpreting data. Because enrolled students may have different levels of exposure to these skills, this section lists additional materials that can help students succeed in this course.

This is a fast-paced course with multidisciplinary reading materials, and students are strongly encouraged to proactively use the supplementary materials (freely available online) as needed.

Website Available support
Stack Overflow
Your one-stop shop for common coding problems.

Some courses related to this course:

  • Design Thinking for Innovation
  • Human-Computer Interaction
  • Bayesian Statistics: From Concept to Data Analysis
  • Social Computing
  • Designing, Running, and Analyzing Experiments
  • Interactivity with JavaScript
Helpful courses:
  • Introduction to Video Editing
  • Premiere Pro CC Essential Training (2015)

Guidelines & Policies

Attendance Policy. Class attendance is not always mandatory; however, research indicates that students who attend class are more likely to be successful. You are strongly encouraged to attend every class. Lectures may not be recorded, and there may not be slides. If you are unable to attend class, consider asking a classmate to take notes for you.

Academic Misconduct. All students should aspire to the highest standards of academic integrity. Using another student’s work on an assignment, cheating on a test, not quoting or citing references correctly, or any other form of dishonesty, unauthorized collaboration, or plagiarism shall result in a grade of zero on the item and possibly an F in the course. Incidences of academic misconduct shall be referred to the Department Head and pertinent University officials and repeated violations shall result in dismissal from the program.

All students are responsible for reading, understanding, and applying the Code of Student Rights, Responsibilities and Conduct and in particular the section on academic misconduct. Refer to UIC student affairs.

All students are strongly encouraged to read what constitutes plagiarism here and complete this short tutorial here. You must document the difference between your writing and that of others. Use quotation marks in addition to a citation, page number, and reference whenever writing someone else’s words (e.g., following the Publication Manual of the American Psychological Association).

Cheating. Cheating is an attempt to use or provide unauthorized assistance, materials, information, or study aids in any form and in any academic exercise or environment. A student must not use external assistance on any “in-class” or “take-home” examination, unless the instructor specifically has authorized external assistance. This prohibition includes, but is not limited to, the use of tutors, books, notes, calculators, computers, and wireless communication devices. A student must not use another person as a substitute in the taking of an examination or quiz, nor allow other persons to conduct research or to prepare work, without advance authorization from the instructor to whom the work is being submitted. A student must not use materials from a commercial term paper company, files of papers prepared by other persons, or submit documents found on the Internet. A student must not collaborate with other persons on a particular project and submit a copy of a written report that is represented explicitly or implicitly as the student’s individual work. A student must not use any unauthorized assistance in a laboratory, at a computer terminal, or on fieldwork. A student must not steal examinations or other course materials, including but not limited to, physical copies and photographic or electronic images. A student must not submit substantial portions of the same academic work for credit or honors more than once without permission of the instructor or program to whom the work is being submitted. A student must not, without authorization, alter a grade or score in any way, nor alter answers on a returned exam or assignment for credit.

Fabrication. A student must not falsify or invent any information or data in an academic exercise including, but not limited to, records or reports, laboratory results, and citation to the sources of information.

Plagiarism. Plagiarism is defined as presenting someone else’s work, including the work of other students, as one’s own. Any ideas or materials taken from another source for either written or oral use must be fully acknowledged, unless the information is common knowledge. What is considered “common knowledge” may differ from course to course. A student must not adopt or reproduce ideas, opinions, theories, formulas, graphics, or pictures of another person without acknowledgment. A student must give credit to the originality of others and acknowledge indebtedness whenever: directly quoting another person’s actual words, whether oral or written; using another person’s ideas, opinions, or theories; paraphrasing the words, ideas, opinions, or theories of others, whether oral or written; borrowing facts, statistics, or illustrative material; or offering materials assembled or collected by others in the form of projects or collections without acknowledgment.

Interference. A student must not steal, change, destroy, or impede another student’s work, nor should the student unjustly attempt, through a bribe, a promise of favors or threats, to affect any student’s grade or the evaluation of academic performance. Impeding another student’s work includes, but is not limited to, the theft, defacement, or mutilation of resources so as to deprive others of the information they contain.

Violation of Course Rules. A student must not violate course rules established by a department, the course syllabus, verbal or written instructions, or the course materials that are rationally related to the content of the course or to the enhancement of the learning process in the course.

Facilitating Academic Dishonesty. A student must not intentionally or knowingly help or attempt to help another student to commit an act of academic misconduct, nor allow another student to use his or her work or resources to commit an act of misconduct.

Right to revise. The instructor reserves the right to make changes to this syllabus as necessary and, in such an event, will notify students of the changes immediately.

Grievance Procedures. UIC is committed to the most fundamental principles of academic freedom, equality of opportunity, and human dignity involving students and employees. Freedom from discrimination is a foundation for all decision making at UIC. Students are encouraged to study the University’s Nondiscrimination Statement. Students are also urged to read the document Public Formal Grievance Procedures. Information on these policies and procedures is available on the University web pages of the Office of Access and Equality.

Recording and Copyrights. Audio/Video Recording: To ensure the free and open discussion of ideas, students may NOT record classroom lectures, discussions, and/or activities without the advance written permission of the instructor, and any recording approved in advance may be used solely for the student's own private use.

Copyrighted Material: All material provided through this web site is subject to Copyright and Fair Use laws. This applies, but is not limited, to class/recitation notes, slides, assignments, solutions, project descriptions, etc. You are allowed (and expected!) to use all the provided material for PERSONAL use. However, you are strictly prohibited from sharing the material with others in general and from posting the material on the Web or other file sharing venues in particular.

Course Evaluations. Because student ratings of instructors and courses provide very important feedback to instructors and are also used by administrators in evaluating instructors, it is extremely important for students to complete the confidential online course evaluations of the Campus Program for Student Evaluation of Teaching. You will receive an email from the Office of Faculty Affairs inviting you to complete your course evaluations, and a confirmation email when you have completed each one.

For more information, please refer to the UIC Course Evaluation Handbook.

Results for the “six core questions” will be published on the UIC course evaluation website.