What we do
Our Research and Evaluation (R&E) team works one-on-one with MIT affiliates to design rigorous educational research studies that evaluate the academic student experience at MIT.
Our R&E experts provide services requiring strong social scientific and analytic research skills to affiliates across the Institute. The findings generated through this tested process aid our affiliates in making data-informed decisions about the academic experience at MIT.
If you choose to work with one of our R&E team members, we will:
- Work with you to solidify your research questions and needs.
- Organize and collect pertinent data.
- Analyze the data using relevant analytic approaches.
- Report findings in a way that is meaningful to you, often using data visualization.
In addition to designing and conducting studies, our R&E team collaborates with other offices on campus, such as Institutional Research, and especially across OVC. Through such partnerships, we can best address the growing data and analytic needs of our innovative MIT community.
Examples of our recent work
Comparative study of an online computer science course and a comparable residential course
In response to a request from two MIT faculty members in the Department of Electrical Engineering and Computer Science, a two-semester qualitative study of 6.S064 (Circuits and Electronics) compared students who took an online version of the course with those who took a comparable residential version. The fall study focused primarily on the online course, drawing on a variety of data that included surveys, interviews, exam scores, problem-set performance, and data from the online system. The spring study followed a similar pattern but added a paper exam administered to both the online and residential students at the same time and in the same place.
Qualitative study of the impact of an experimental PNR policy on first-year student choice of major
As part of the FYX study team’s evaluation of the experimental Pass/No Record policy, a study examined how first-year students deferred some GIRs in order to explore potential majors, and the subsequent impact of those decisions. The interview protocol emerged from discussions with FYX faculty members and administrators, as well as notes from committee meetings. The fall data collection included both surveys and interviews, generating a large volume of data that required multiple rounds of coding to unearth answers to the committee’s questions of interest and to identify new themes unanticipated at the start of the study.
Program evaluation study of an international education program
MISTI originally approached TLL to conduct a program evaluation of its summer internship program for undergraduate and graduate students, a program that provides internship opportunities by matching students with researchers, companies, and other partners from around the world. The study consisted of three cohorts, each comprising 500-600 subjects who completed pre- and post-surveys. The study evolved through a close collaboration between the R&E team member and the MISTI faculty advisor, executive director, and 16 program managers, which shaped the study’s final design and survey content. The survey covered core questions for all participants as well as modular questions that explored aspects specific to selected countries. As a result of this collaborative effort, the results provided the MISTI staff with data useful in preparing students for their internships and demonstrated the program’s impact on students’ ability to integrate into different societies culturally and professionally. MISTI staff also asked the R&E team member to develop a similar study for their global teaching opportunities. This study included two cohorts of approximately 250 students.
Data visualization in program evaluation
We use data visualization to convey quantitative and qualitative findings to those with whom we work. Using traditional tools such as Excel alongside newer visualization tools like Tableau, we generate a variety of visualization types that suit our audiences’ needs. For example, to convey students’ open-ended responses to a survey question, one R&E team member used a tool typically applied in network analysis to show the connections between primary phrases and words in students’ responses. Clarity of findings and data accessibility drive our work in data visualization.
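The network-analysis idea mentioned above can be sketched in a few lines: treat each distinct word in a response as a node, and link two words whenever they appear in the same response, so that frequently co-mentioned terms surface as the heaviest edges. The responses, stopword list, and function below are hypothetical illustrations, not the team's actual tool or data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical open-ended survey responses (illustrative only).
responses = [
    "more office hours and better recitations",
    "office hours were helpful but recitations felt rushed",
    "problem sets reinforced lecture material",
    "lecture pacing and problem sets were well aligned",
]

# A tiny illustrative stopword list; a real study would use a fuller one.
STOPWORDS = {"and", "but", "were", "felt", "the", "was", "more"}

def cooccurrence_edges(texts, stopwords=STOPWORDS):
    """Count, for each pair of words, how many responses mention both.
    The weighted pairs are the edges of a word co-occurrence network."""
    pairs = Counter()
    for text in texts:
        # Deduplicate and sort so each pair has one canonical ordering.
        words = sorted({w for w in text.lower().split() if w not in stopwords})
        pairs.update(combinations(words, 2))
    return pairs

edges = cooccurrence_edges(responses)
# ("hours", "office") and ("problem", "sets") each co-occur in two
# responses, so they emerge as the strongest connections.
```

From here, the weighted edges could be handed to a graph layout in a network tool (or a library such as networkx) to render the connections visually; the sketch only shows how the underlying structure is derived from the text.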
Leading an Institute-wide effort to reduce surveys
In August 2018, Ian Waitz, the Vice Chancellor for Undergraduate and Graduate Education, asked TLL to assemble and lead a team charged with reducing the increasing number of surveys administered on campus. The team consisted of representatives from the Offices of the Vice Chancellor, Provost, and Student Life. In addition to submitting recommendations for future steps, the team launched the Data Talks series, with the mission of increasing the analytic capacity of Institute staff who collect and/or use data to make informed decisions.
Program evaluation study of the impact of an interdisciplinary quantum information science and engineering program on doctoral students
The principal investigators of an interdisciplinary training program in quantum information science and engineering, funded by NSF IGERT, asked TLL to conduct the program evaluation. Based on discussions with faculty members, the evaluation study consisted of surveys and interviews with the doctoral students of each cohort at 18 and 36 months into their training. Because the surveys and interviews included some similar questions, responses could be cross-validated. Moreover, using the two methodologies allowed for exploration of areas where one was more effective than the other. Data from three cohorts of doctoral students provided the basis for a series of reports that addressed classes, internships, relationships (with advisors, faculty, and other graduate students), research, research paths, research networks, and views of the program.
Interview study of impact of a postdoctoral training program on how fellows evolved as researchers
The principal investigators of the M+Visión Fellowship in Translational Biomedical Imaging asked TLL to conduct a longitudinal, multi-cohort interview study of the impact their postdoctoral training program had on the fellows. For each of the three cohorts, TLL conducted interviews at the beginning of training and at the end of the first and second year of training. The interviews were audiotaped, informal, and conversational, and on average lasted more than an hour. The informal, conversational approach gave fellows the opportunity to describe their experience in their own words, which, in turn, may have motivated them to respond more deeply and openly. The study provided evidence of how the program shaped the ways fellows identified new lines of fruitful research, critically examined their research and that of others, and developed research teams and networks.