Unknown Territory: K-12 STEM Summer Exploration Through Zoom

When COVID-19 shut down in-person programs across the world in 2020, we, practitioners at the National High Magnetic Field Laboratory (MagLab), were in unknown territory. This paper outlines the structure of a 10-week online Summer Exploration Series (SES) program, along with the evaluation performed by staff at the MagLab in 2020. The goal of the SES program was to increase youth's interest in STEM and knowledge of STEM careers relevant to materials science. Each week included live and asynchronous components. All of the participants (n=86) rated the program as above average or higher, crediting the program with teaching them about new STEM careers/topics and increasing their interest in STEM. Our evaluation indicated that the live sessions, particularly those that offered more opportunities for interaction, were rated higher than the asynchronous sessions, providing evidence of the benefits of live, even if online, sessions for increasing interest in STEM. The lessons learned through the program can inform other organizations as we continue into the new normal.


INTRODUCTION
Informal STEM education programs (e.g., summer camps) can play a crucial role in sparking and maintaining the interest of young people in STEM and STEM careers (Chan et al., 2020; Hughes and Roberts, 2019; Riedinger and Taylor, 2017; Roberts and Hughes, 2019). These programs provide opportunities for youth to engage in authentic STEM inquiry where they can meet role models and develop STEM skills in a safe and nurturing environment (Barron and Bell, 2015; Dahn and DeLiema, 2020; Hughes et al., 2020; King and Pringle, 2018; Schmidt et al., 2020). Most research and practitioner papers have focused on the hands-on and in-person engagement that benefits participating students (Hughes et al., 2020). Hence, when COVID-19 shut down in-person programs across the world, we as practitioners were in unknown territory.
The National High Magnetic Field Laboratory (MagLab) is a large interdisciplinary facility with a commitment to education and outreach. The Center for Integrating Research and Learning (CIRL) is the education arm of the MagLab. CIRL has run middle school summer camps aimed at improving youth's interest in STEM and STEM careers since 2006. The camps seek to provide an environment where students can explore their STEM interests and participate in hands-on activities to develop their identities as potential scientists. CIRL has historically prioritized providing spaces for students to engage in the practices of science, rather than teaching specific disciplinary content or standards. Additionally, CIRL provides this experience in a real lab setting. The summer camps show students what a career in STEM looks like. MagLab scientists make frequent appearances in summer camps by giving tours of their labs, conducting activities related to their research, and telling their career stories. This integration of MagLab scientists and CIRL educators yields an authentic and engaging experience for students who are beginning to explore and refine their STEM interests.
Due to COVID-19, CIRL was unable to hold any in-person summer camps for the first time in our history. Within a month, CIRL had to cancel our in-person camps and develop a program that could be hosted online while maintaining the original spirit and motivation of the traditional summer camps. For the summer of 2020, we created the MagLab Summer Exploration Series (SES) to maintain our efforts of engaging youth in STEM and connecting youth to the science at the MagLab. Previously, summer camps provided a high-touch experience for approximately 24 students per camp to deeply engage with STEM disciplines and role models. However, this depth of experience comes with a trade-off, as it reduces the number of students we can accept. One advantage of an online program was that more students could be included in the experience. Rather than trying to force the traditional in-person model into an online format, we decided to embrace this possibility and expand access to the lab to more youth than we traditionally serve. Consequently, our goals shifted slightly from the traditional in-person camps. Table 1 compares the goals of the in-person summer camps to the goals of the SES to highlight the change in focus for this new virtual program. The goals driving the traditional summer camps focused more on providing individuals with exposure to STEM careers and professionals, opportunities to engage in STEM activities, and chances to develop and refine skills to help them succeed in STEM. The goals for the virtual SES program focused on creating and expanding access and providing students with an idea of the breadth of STEM subjects and research, particularly as they apply to research at the MagLab. The SES program focused on creating a broad experience that would appeal to a larger and more diverse audience. Rather than focusing intensely on a few topics, we decided to offer a broad array of topics so that youth could tune in to only the sessions that interested them or join all sessions to get a better understanding of the STEM disciplines represented at the MagLab.

OVERALL PROGRAM DESIGN
The SES program was designed to be a flexible and pandemic-safe alternative to our traditional in-person summer camps, catering to a K-12 audience, with a focus on middle and high school. In order to accommodate a diverse audience, the SES program was designed to be a modular program, with students able to customize their experience by choosing to participate in the "modules" that best aligned with their interests. Overall, there were two dimensions for each module: weekly theme/topic and daily activities. Each week of the program featured a different STEM discipline represented at the MagLab (topic/theme). Each day of the week included a specialized activity (type). The daily activity type was consistent across weeks (e.g., every Monday the activity would include an introduction to the new weekly theme). So even though the overall topic changed each week, the structure of each day of the week remained consistent. Table 2 shows the template schedule for each week, including the focus, the activity, platform, and timing for each day.
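To make the two-dimensional module structure concrete, the following minimal sketch models the schedule as a grid of weekly themes crossed with a fixed daily-activity template. The names and structure here are purely illustrative conveniences of ours; the program itself was not software.

```python
# Illustrative sketch: the SES schedule as weekly themes crossed with a
# fixed daily-activity template. All identifiers are hypothetical.

DAILY_TEMPLATE = {
    "Monday": ("Zoom Monday", "synchronous"),        # live intro to theme + challenge
    "Tuesday": ("Links Exploration", "asynchronous"),
    "Wednesday": ("Ask Me Anything", "synchronous"),
    "Thursday": ("Career Interview", "asynchronous"),
    "Friday": ("Share Fair", "synchronous"),
}

WEEKLY_THEMES = [
    "Intro to the MagLab",
    "Introductory Physics",
    "Intro to Electromagnetism",
    # ...one theme per week, ten weeks total (see Table 3)
]

def build_schedule(themes, template):
    """Cross each weekly theme with the fixed daily activity template."""
    return {
        (week, day): {"theme": theme, "activity": name, "mode": mode}
        for week, theme in enumerate(themes, start=1)
        for day, (name, mode) in template.items()
    }

schedule = build_schedule(WEEKLY_THEMES, DAILY_TEMPLATE)
print(schedule[(2, "Monday")])
# {'theme': 'Introductory Physics', 'activity': 'Zoom Monday', 'mode': 'synchronous'}
```

This separation of theme from template is what let students self-select: a given day offered the same type of activity regardless of the week.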
Table 1. Goals and elements of the traditional in-person summer camps compared to the Summer Exploration Series.

In-Person Summer Camps:
• Hands-on activities with materials provided, lasting anywhere from thirty minutes to two hours, for a total of eight hours per day for five consecutive days.
• Live interaction with STEM role models, with the ability to have personal interaction and direct conversations.
• Access to MagLab facilities for participants in the local community.

Both Programs:
• Introduce participants to STEM careers and role models.

Summer Exploration Series:
• Live Zoom presentations three times per week over the course of ten weeks, introducing participants to science topics, careers, and role models.
• Access to one pre-recorded interview per week with a STEM expert, over the course of ten weeks.
• Access to a curated list of articles and videos from the MagLab website once per week for ten weeks.
• Access to MagLab research and scientists for participants across the US.
• An interactive challenge once per week for ten weeks that allowed youth to choose the format and depth of their participation, while the sharing of submissions showed them alternate experimental designs and interpretations of the STEM challenge.

The structure of each week remained consistent even though the topic changed weekly. This allowed students to self-select into the topics and types of activities that interested them most. For example, if a student was interested in learning only about career options, they could tune in only on Wednesday and Thursday of each week. The other dimension of the program was the weekly topics. To showcase a diverse array of STEM disciplines, each week focused on a different area of STEM represented at the MagLab and culminated in a presentation of student-submitted challenge responses. Table 3 presents each of the weekly topics and its related challenge.

The SES program had synchronous and asynchronous options for participants to accommodate as many different schedules as possible. All synchronous activities were conducted via Zoom, which provided a platform for students to actively participate in sessions from any location. When planning the SES program, we prioritized flexibility and ease of access, as the program was being offered at a time when Zoom was less ubiquitous. To facilitate ease of access, the Zoom webinar platform was used rather than a traditional Zoom meeting. With the webinar platform, we were able to create a repeating link, so the same link could be used all summer rather than asking participants to keep track of 30 different links for the synchronous sessions. Traditionally, using the same link poses security threats to Zoom meetings. The webinar platform accommodates for this by disabling attendee video and giving the host the option to limit chat messages so that they can only go to the hosts. Youth participants could message the hosts and panelists; their questions or topics from the chat were then shared verbally by the hosts, so youth could still be active participants. Additionally, the Q&A feature of Zoom webinars was turned off, since submitted questions would be broadcast to all attendees. These settings did inhibit some levels of engagement but gave us much greater control over the security of the meetings, which we felt was important given that the program participants were minors.
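For readers planning a similar setup, the sketch below shows how such a recurring webinar might be created programmatically. We believe Zoom's REST API exposes a create-webinar endpoint (POST /v2/users/{userId}/webinars) where type 9 denotes a recurring webinar with fixed times, but treat the payload, especially the recurrence encoding and settings fields, as assumptions to verify against Zoom's current documentation rather than a confirmed configuration.

```python
# Hedged sketch: create one recurring Zoom webinar so a single join link
# can be reused for every synchronous session. Field names and values are
# assumptions to check against Zoom's current API documentation.
import requests

ACCESS_TOKEN = "..."  # OAuth access token (placeholder)

payload = {
    "topic": "MagLab Summer Exploration Series",
    "type": 9,  # assumed: recurring webinar with fixed times -> one reusable link
    "recurrence": {
        "type": 2,               # assumed: weekly recurrence
        "repeat_interval": 1,
        "weekly_days": "2,4,6",  # assumed encoding: Monday, Wednesday, Friday
        "end_times": 30,         # 10 weeks x 3 live sessions
    },
    "settings": {
        "auto_recording": "cloud",  # sessions were recorded and posted
    },
}

resp = requests.post(
    "https://api.zoom.us/v2/users/me/webinars",
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()
# In webinar mode, attendee video is off by design; the chat and Q&A
# restrictions described above were managed in the webinar settings.
print(resp.json()["join_url"])  # the single link shared all summer
```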
When it came time to plan the final details of each individual activity, the focus was on creating experiences similar to our traditional summer camps and highlighting the diverse areas of STEM represented at the MagLab. Given this focus and our priority of creating an open and flexible schedule, five program goals were developed to help drive the final stages of planning:
• Expand middle and high school students' access to the MagLab through virtual education content and resources.
• Create access to the MagLab for students outside of the local area through virtual education content and resources.
• Increase students' knowledge and recognition of science topics that are integral to the MagLab's research agenda through themed activities and presentations.
• Provide students with knowledge of the broad nature of careers in STEM fields by showcasing the diversity of careers at the MagLab.
• Maintain or improve students' interest in STEM fields through themed activities and presentations.
The final element in the overall planning stages was developing a recruitment strategy. Advertising for the SES was similar to the recruitment efforts for our traditional summer camps, with the exception of broadening the advertising to a national level. The program was featured as the first highlight in the MagLab website's carousel. Additionally, emails were sent to the MagLab's Educator's Club mailing list (educators and parents who sign up for frequent announcements) and to all MagLab camp alumni. To reach a national audience, advertising was done through the MagLab's social media platforms, which have nearly 12,000 followers/fans across all of their networks. What was missing from our traditional recruitment efforts was the mailing of posters to local schools, since schools had been shut down due to the pandemic. The pandemic also removed our ability to advertise at numerous in-person events, such as school science nights and local weekend informal education events and festivals. All advertising was digital, through email or social media.

Table 3. Weekly topics and challenges.

Week 1 - Intro to the MagLab. Strongest Magnet: Experiment with the magnets in your home to find out which is the strongest magnet.
Week 2 - Introductory Physics. Demagnetizing: Use items around your home to try to weaken the magnetic fields around your magnets. Try to see if you can demagnetize one of your magnets.
Week 3 - Intro to Electromagnetism. Electromagnetic Field: Use items around your home to show a magnetic field created from electricity affecting a weaker magnet nearby.
Week 4 - Magnet Science & Technology. Homemade Electromagnet: Make a working electromagnet. For added difficulty, manipulate the strength so that it picks up exactly 7 paperclips (or staples, or pins).
Week 5 - Engineering at the MagLab. Engineering Design: Design and/or build one of these items that is currently needed at the MagLab: 1. a door stop, 2. a hands-free door handle, or 3. a hand sanitizer holder for walls or doors.
Week 6 - Superconductors & Cryogenics. Viscosity of Liquids: Collect data on the viscosity of liquids around your home. If you feel daring, experiment with if/how temperature affects them.
Week 7 - Materials Science. Making the Best Crystals: Take on the role of a crystallographer and create the best possible crystals. "Best" could mean size, color, or clarity; I want you to choose one and perfect your technique.
Week 8 - Life Science. Magnetic Field Machine: Build a machine that allows you to see something hidden using magnetism. Show what it looks like and how it uses magnetism to show something that would otherwise be unseen.
Week 9 - Biology & Chemistry. Homemade Emulsions: Make your own emulsion. Monday's Zoom meeting gives you a few hints at the ratio, so this week you are expected to make your own emulsion (mayonnaise) using oil, 1 egg, vinegar or lemon juice, and Dijon mustard.

PROGRAM ELEMENTS
To mimic our summer camps, which occur over five weekdays, we organized the SES similarly. Each weekday had either a synchronous or an asynchronous component. Participants who decided to work on the weekly challenge could use the daily components to inform their choices for the final weekly challenge submission.

Synchronous Activities. The live, synchronous events were held on Mondays, Wednesdays, and Fridays. On Zoom Mondays, a live session introduced both the weekly topic and the hands-on challenge. The SES host was joined by a MagLab scientist or engineer who specialized in the week's theme. These STEM professionals served as role models for the students. They introduced the topic and explained how it connected to the research being done at the MagLab using diagrams and/or videos. This presentation gave participants an explanation of the science and engineering so they could apply that information to the weekly at-home challenge, which was introduced and explained after the expert finished their presentation. CIRL made sure to involve a diverse group of scientists and engineers from the MagLab to highlight different perspectives. Diversity included gender, race/ethnicity, and career level (undergraduates, graduate students, postdocs, and faculty/staff).
Ask Me Anything (AMA) synchronous interviews were offered on Wednesdays and led by the SES host. Each AMA featured an early-career MagLab STEM professional, including graduate students. K-12 participants were able to type their questions to the chat moderator, who could then pose those questions to the host and guest. Questions ranged from the guest's favorite part of their job to their favorite fandom outside of work. The purpose of the open format was to allow SES participants to connect to scientists as everyday people and to challenge commonly held stereotypes about scientists and engineers. Some of the more popular questions asked about food, hobbies outside of work, and how/if they personified the machines they worked with.
Finally, on Share Fair Fridays, the host displayed the challenges submitted by participants that week. Participants could submit descriptions, videos, or photos of their work via email, which created a variety of documentation for other students to view in the Friday sessions. The Friday session served both to recognize the work students performed individually on the weekly challenge and to show the group different ways of solving it. The presentation of challenge submissions created additional opportunities for engagement for students who opted out of completing that week's challenge. In the later weeks of the program, a scientist or engineer knowledgeable about the week's subject also joined to give expert insight into each of the submitted challenges.
Asynchronous Activities. The asynchronous events were held on Tuesdays and Thursdays. On Tuesdays, participants were encouraged to visit the MagLab website to view specific videos and tutorials that focused on the weekly theme. Students were given a PDF document of links that leveraged the MagLab's existing online education content, so we did not need to create new videos and articles each week. The document with each week's collection of links was posted to the SES program webpage on the MagLab's website to ensure easy access for participants throughout the program. The links were chosen for the grade level at which the articles were written, as well as their connections to the week's topic. The documents included links to interactive demonstrations, virtual tours, interviews with scientists, and news articles. Each weekly PDF included a link to short introductory videos in which MagLab researchers explain what they do in two minutes or less. For example, during week 2, participants were encouraged to visit a post on our Magnet Academy page that explained magnets; links were then provided to three demos on the website showing magnetic field lines, how Van de Graaff generators work, and how microwaves work. Participants were then encouraged to visit interactive tutorials on the website that showed how magnets and compasses work.

On Thursdays, participants viewed an in-depth (about 30 minutes) pre-recorded career interview with a MagLab scientist. This recorded interview focused on the scientist's path to the MagLab, including what sparked their passion for STEM, their choices in high school classes, and how their university decisions launched their trajectory to where they are today. Additionally, the scientists were asked why the MagLab was the best location for their research and to share some of their favorite aspects of being a researcher, as well as any moments of levity during their careers at the MagLab. The goal of these videos was to highlight the many pathways to STEM careers so that students could see there are multiple ways to pursue a career in STEM. These pre-recorded videos provided a more in-depth career trajectory that differed from the informal and shorter format of the Ask Me Anything sessions. The two formats were chosen both to reach participants looking for advice on STEM careers and to let participants connect with the scientists as individuals.

Weekly Challenge. The weekly challenge was designed to be the hands-on component of the SES. Each week's challenge was connected to the weekly theme. Participants were asked to attempt the challenge and then submit their work through email. Participants were encouraged to do their best with whatever materials they had available at home, since many families were in quarantine during this time. They were told that any submission would be accepted: a video, a photo, or even a written description of what they did. For example, when asked to test the strength of their magnets in week 1, submissions included: (1) a description of an experiment ("I put an iron bar on a table, and with the magnets, then I measured the length of the magnetic field with a ruler"); (2) photos of magnets with varying numbers of paperclips attracted; and (3) a video portraying the challenge as a one-on-one elimination competition, with narration similar to what you would experience at a sporting event. By ensuring the challenges could be completed with everyday items at home, we met our goal of equitable access: all students could engage in scientific inquiry regardless of family income. Table 3 presents the full list of weekly challenges as they were presented to the participants on the website. At the conclusion of the program, students were mailed prizes based on the number of weeks in which they submitted a challenge. Prizes were awarded on a cumulative basis, so as students completed more challenges, their prize collection grew in both quantity and quality.

EVALUATION METHODS
Evaluation for the program was conducted by CIRL's internal evaluator. As mentioned earlier, the entire CIRL team met during the planning stages to identify the five goals listed in Table 1, which helped drive both the planning and evaluation of the program. The evaluation assessed the extent to which these goals were met, as well as participants' satisfaction with the major elements of the program design. These major elements included the six weekly activities (Zoom Mondays, Tuesday Links Exploration, AMA Wednesdays, Thursday Career Interviews, Friday Share Fair, and the Weekly Challenge), the weekly themes, and the program overall. The evaluation effort leveraged data from multiple sources. The timing, frequency, and associated metrics for each data source are summarized in Table 4. The primary data sources for the evaluation were:
1. Program Registration Form: collected demographic information on interested participants and contact information for each participant. Students had to complete this form to receive the Zoom links.
2. Weekly Surveys: at the end of each week, a survey went out to everyone who registered for the SES program. The survey asked which activities they participated in, their favorite and least favorite activity of the week, their overall satisfaction with the week, and whether they learned new things that week.
3. Post-Program Survey: at the conclusion of week 10, all participants were sent a combined week 10 and post-program survey. In addition to the week 10 questions, the survey asked about participants' experiences in the program overall.
4. Zoom Reports: the Zoom platform provided attendance reports, which included the names of all attendees and how long they attended each meeting. Additionally, Zoom provided transcripts of the chat from each meeting.
5. YouTube Metrics: all asynchronous videos used for the program were hosted on YouTube, which provides metrics on view counts, audience, and audience retention.

Table 4. Data sources, timing, and associated metrics.
• Program Registration Form: completed once, at the beginning of participation. Metrics: demographics.
• End of Week Survey: completed once per week, at the end of each week of the 10-week program. Metrics: student satisfaction, attendance, student learning.
• End of Program Survey: completed once, at the end of the program; combined with the Week 10 weekly survey.
• YouTube Metrics: pulled by the manager of the MagLab YouTube channel once per week, for the previous week.
• Zoom Attendance Reports: downloaded after each session and reviewed by the evaluator once per week.
• Zoom Chat Transcripts: downloaded after each session and reviewed by the evaluator once per week.
• Weekly Challenge Submissions: uploaded by the Program Manager each week and reviewed by the evaluator once per week.
For each of the six weekly activities, we were interested in capturing metrics representing both attendance and engagement. However, the differing nature of each activity meant that the concepts of "attendance" and "engagement" had to be defined for each activity. In general, attendance for synchronous events was operationalized as the number of students who logged in to the Zoom meeting. For asynchronous activities, attendance was operationalized as unique views of the videos during the week the video was featured in the SES program. Engagement in synchronous sessions was generally measured by the number of relevant questions participants asked or answers to the presenters' questions they provided. Audience retention, i.e., the average percentage of viewers that watched the video to the end, was the proxy for engagement in asynchronous activities. Full explanations for how attendance and engagement were defined for each activity are presented in Table 5.
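To make these operational definitions concrete, here is a minimal sketch of how the four metrics could be computed from weekly platform exports. The file and column names are hypothetical stand-ins, not Zoom's or YouTube's actual export schemas, and the `relevant` flag assumes the evaluator has hand-coded chat messages beforehand.

```python
# Hedged sketch of the operational definitions summarized in Table 5,
# applied to hypothetical weekly exports. All column names are assumed.
import pandas as pd

zoom = pd.read_csv("zoom_attendance_week3.csv")  # one row per attendee login
chat = pd.read_csv("zoom_chat_week3.csv")        # one row per chat message
yt = pd.read_csv("youtube_week3.csv")            # one row per featured video

# Synchronous attendance: number of students who logged in to the meeting.
sync_attendance = zoom["attendee_name"].nunique()

# Synchronous engagement: count of relevant questions asked or answers given
# (assumes a hand-coded boolean `relevant` column from the evaluator).
sync_engagement = int(chat["relevant"].sum())

# Asynchronous attendance: unique views during the video's featured week.
async_attendance = int(yt["unique_views"].sum())

# Asynchronous engagement: audience retention for the featured video(s).
async_engagement = yt["audience_retention"].mean()

print(sync_attendance, sync_engagement, async_attendance, async_engagement)
```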

EVALUATION RESULTS
Overall, 100% of students rated their experience in the SES program as "above average" or "outstanding". This corresponded to an overall rating of 4.86 out of 5. The participants indicated that they learned about new STEM jobs and topics, and became more interested in science and engineering. The next sections present detailed results on the progress towards the program goals, levels of program uptake, and the perception of the various program elements.
Program Uptake. One of the challenges of virtual programming is attracting interested participants to the program and retaining them through it. Consistent with this, of the 184 students registered for the program, only 46.7% attended at least one live session. Our evaluation did not cover why these individuals did not attend, but this may be an area of interest for future feasibility and evaluation studies. In total, 86 youth participated in at least one session of the SES program. In terms of demographics, there did not appear to be any differential rates of program uptake based on race, ethnicity, gender, or grade in school. We did, however, see a decline in the percentage of participants from Title I schools from registration to attendance. Table 6 shows the breakdown of program registrants (i.e., those who completed the program registration form) versus participants (i.e., those who attended at least one session).
Compared to the demographics of our in-person summer camps, the SES program included a greater percentage of male and Asian participants and a lower percentage of students from Title I schools. Two of our three in-person summer camp programs are only for girls, which explains the large difference in the percentage of male participants in SES compared to the summer camps.

Asynchronous Activities. We examined audience retention (i.e., how long viewers watched the video) for the Tuesday and Thursday activities each week. In general, audience retention was lower on Thursdays. This could be because the students were less interested in the video topics, or because the Thursday videos were much longer. Overall, the asynchronous activities did not score as well with participants as the live sessions. At the end of each week, participants were asked to select their favorite and least favorite activities, and students selected the asynchronous activities as their favorites at low rates. Additionally, the Tuesday Links Exploration had relatively high rates of students selecting that day as their least favorite activity (for example, 42.9% of survey respondents in weeks 2 and 8 indicated the Tuesday activity was their least favorite).

Weekly Challenge. The weekly challenge could also be completed fully asynchronously. Students who missed the challenge description during the synchronous Monday session could watch a recording posted on the SES website, or email the SES host for a written description. Of all the program elements, students responded most enthusiastically to the weekly challenge. In total, we received 142 submissions to the weekly challenge over the course of the program, and 7 participants completed all 10 weekly challenges. Table 11 shows the submissions and ratings for each week's challenge. Overall, the number of challenge submissions remained stable over the course of the program (apart from the first week, which received a few more than the others), indicating consistent engagement with the weekly challenges. Additionally, the weekly challenge was the activity most often endorsed as participants' favorite activity of the week (ranging from 14.3% to 71.4% of survey respondents), and it was rarely endorsed as their least favorite; in 6 of the 10 weeks, 0% of survey respondents indicated the weekly challenge was their least favorite activity.

Weekly Topics. In order to evaluate the performance of the weekly topics, we examined attendance, engagement, participant satisfaction, and the relationship between each topic and participants' interest and knowledge. Overall, the metrics for all the weeks were strong and relatively consistent, with participants indicating that they learned new things each week and were satisfied with the weekly topics (the lowest satisfaction score was 4.23 out of 5). However, we were interested in using the evaluation data to select a few of the higher-performing topics should we decide to offer a shortened version of this program in the future. Attendance and engagement were relatively consistent across the weeks, so to determine the most impactful topics, we turned to participants' perceptions based on survey data. Weeks 2 through 6 had some of the highest ratings for overall satisfaction and interest, and 100% of the survey respondents indicated that they learned more about the week's topic during those weeks. Given the strong performances in weeks 2 through 6, and their more concrete connections to the MagLab, these would be the themes we would focus on if we conduct this program again.

LESSONS LEARNED: FOR EVALUATORS
The evaluation of this entirely virtual program created a distinct learning opportunity because we could not rely on our traditional in-person methods of evaluation. From this experience, there were a number of lessons learned that we will outline for other evaluators of virtual programs. The different nature of virtual interactions and the use of new technologies were the two biggest factors that impacted the evaluation process of this program. The three greatest changes to the evaluation procedures were: the formalization of attendance and engagement as official metrics for the program; the ability to leverage technological platforms to yield additional data; and the complexities of making comparisons when different data sources are utilized.
In our traditional summer in-person programs, participants typically show up for the camps they have signed up for. However, given the flexible nature of the SES program and its occurrence in a virtual space, attendance became a variable. This necessitated building ways of measuring attendance both within program elements and in the program overall. Participant engagement, or the level at which the youth actively and enthusiastically participated, was variable even in in-person programs, but gauging participants' engagement as a piece of formative evaluation became much more challenging in the virtual space. Engagement between participants was limited because of our choice to create a safe and secure space without negative comments or Zoom bombing. Students were not able to talk directly to each other or to the guests, so engagement in this space translated into sending one-way comments or questions that the hosts and panelists could then share more broadly. In in-person summer camp settings, attendance and engagement are measured more passively and addressed immediately: camp teachers can see when a participant has "checked out" and have strategies to re-engage them, without needing data to prompt such an intervention. For virtual programs, evaluators must be much more intentional about collecting information on both attendance and engagement in order to better understand participants' experiences within the virtual program environment.
To collect data on attendance, engagement, and the other metrics outlined in the results section, the virtual format provided more diverse sources of data than a traditional in-person setting. The built-in metrics of the Zoom and YouTube platforms provided new sources of information and shifted much of the data-collection load away from the surveys. These additional sources were tremendous assets in the evaluation of the program. However, with new data sources come new considerations. All videos used during the SES program were hosted on YouTube, a platform that tracks views. The program leveraged existing videos and content for the Tuesday activity. This meant that those videos already had views before the program and were publicly listed and available to anyone on YouTube. For these videos, we were able to filter the YouTube metrics to a particular time range, but we could not fully determine whether the traffic during that window could be ascribed solely to the SES program. This made using YouTube metrics to evaluate the Tuesday activities much more of a challenge, and our findings were less clear. This was not the case, however, for the Thursday career interviews. Those videos were created specifically for the program, and the link to each unlisted video was only available via the program website. This does not strictly prevent non-SES traffic to the video, but it markedly improved the signal-to-noise ratio of the YouTube metrics. We would advise programs to use newly created/posted videos if they wish to use YouTube viewership metrics in their evaluation.
Lastly, the use of several different data sources for disparate activities required a more thoughtful and intentional approach when comparing results. The SES program contained multiple types of activities, offered through diverse modalities. Attendance and engagement had to be operationalized differently for synchronous and asynchronous program elements, which required new evaluation tools and techniques to make comparisons across the activities. We will highlight here the most successful strategies for our program. First, based on our goals for the program, we determined metrics for success (e.g., engagement, attendance). Prior to designing the data collection strategy, we decided which elements of the program we would want to compare and made sure that the data collected for these elements would allow for clear and relevant comparisons during the final analysis phase. Second, we built redundancy into the data collection processes for each metric. For example, Zoom and YouTube provided data to help measure attendance, but we also asked for self-reported attendance on the weekly survey. This allowed us to check consistency between the sources and identify potentially flawed data collection methods; this consistency check is what made the issue with the YouTube metrics for the Tuesday videos apparent. Since we had redundant data sources, we still had a metric for Tuesday attendance even after we discovered one of our sources was flawed. Finally, we leveraged standardization of scores for comparisons that were essential but pulled from different data sources or operational definitions (see Figures 1 and 2 for an example). This allowed for comparisons even when the raw numbers were difficult to compare and contrast.
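As a concrete illustration of that last strategy, the sketch below z-scores each activity's weekly attendance against its own mean and standard deviation, so that Zoom logins and YouTube views, which live on different scales, can be placed on a common footing, and then runs the redundancy check described above. The numbers and names are invented for illustration; this is a sketch of the approach, not our actual analysis code.

```python
# Hedged sketch: standardize each attendance metric within its own data
# source so cross-source trends can be compared, then check consistency
# between platform-reported and self-reported attendance. Data invented.
import pandas as pd

weekly = pd.DataFrame({
    "zoom_monday_logins":   [40, 35, 28, 30, 26, 27, 24, 25, 22, 23],
    "thursday_video_views": [90, 70, 65, 60, 55, 58, 50, 52, 48, 45],
})

# Z-score each column against its own mean/std; the standardized series
# are unitless, so synchronous and asynchronous trends can be compared.
z = (weekly - weekly.mean()) / weekly.std()

# Redundancy check: correlate platform-reported attendance with the
# self-reported attendance from the weekly survey to flag a flawed source.
survey_selfreport = pd.Series([38, 36, 27, 31, 25, 28, 23, 26, 21, 24])
consistency = weekly["zoom_monday_logins"].corr(survey_selfreport)

print(z.round(2))
print(f"Zoom vs. self-report correlation: {consistency:.2f}")
```

A low correlation in such a check is what would flag a source as suspect, as happened with the Tuesday YouTube metrics.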

LESSONS LEARNED: FOR PRACTITIONERS
The pandemic and its timing made planning for the drastic change to virtual programming rushed and difficult. However, CIRL learned four main lessons from our implementation of the MagLab Summer Exploration Series, concerning recruitment, planning, implementation, and post-program reflection.
Recruitment. Active recruitment of participants could not begin until the program's format and schedule were determined. Once those tasks were done, advertising went out to the MagLab educator and camp alumni mailing lists, as well as to the local school districts. For in-person camps, we create posters and hand those out at local schools and outreach events; because of the pandemic, we were not able to reach these audiences. Advertising was done through the MagLab website, social media, and postings on professional networks. While these channels worked well, an earlier start to recruiting would have helped increase participation numbers. If we were to do this program again, we would reinstate the poster distribution and utilize classroom and community outreach to advertise the program. To address the differing uptake levels for Title I schools, we would include a question on the application asking whether internet access would be a barrier to attending. This would allow us to determine whether we need to work with local libraries and schools to provide students with internet access.
Planning. As stated before, planning for the SES program was rushed. To establish a starting point, the CIRL team researched how other informal STEM education spaces had conducted virtual programs. We met weekly throughout the early part of the pandemic to compare notes and collectively viewed various program websites to determine which approaches would fit our program best. We knew the MagLab had high-quality articles, demonstrations, tutorials, and videos that we could easily incorporate. CIRL is proud of its in-person summer camps and the engagement and access students have during these programs, so our early planning attempted to mimic their structure. Originally, the program was planned to have five days a week of synchronous engagement. This idea was abandoned for multiple reasons. First, creating five days of new content for ten weeks would have been difficult, especially with most of the MagLab staff in quarantine. Second, we recognized that participants might not be able, or even want, to attend live sessions every day for ten weeks. Third, we knew that hands-on activities that include exposure to role models are important to improve and/or maintain youths' interest in STEM. We determined that a combination of synchronous live events (including AMAs with scientists) and asynchronous sessions could mimic our in-person camps in a flexible and COVID-19-safe format. We debated mailing materials to youth for the weekly challenges, but since the CIRL team was also quarantined in our own homes, we decided instead to create challenges that used everyday materials. However, if we were to do this program again, we think creating an activity kit for registered participants would create a stronger brand and connect members together in a way that the SES program could not.

Implementation. During the program, we learned lessons about the frequency of activities and the engagement of audience members that can help us tailor the program in the future. Based on our experience, we found that it was difficult for students to attend every day, and this will become even more difficult now that quarantine has ended, families are taking vacations, and youth are attending in-person camps again. As a result, if we were to do a virtual program again, we would probably commit to 1-2 live sessions a week over a summer or school year.
In terms of engagement, during the live sessions we noticed that some participants were very "vocal" in the chat whereas others were not. Having a chat manager made it easier for participants to feel heard, as the chat manager could respond in real time as well as notify the presenters of audience questions. For those who were less engaged, in the future we would create a more secure Zoom link so that participants can use their video and engage verbally. Another engagement issue was the difference between participating in sessions live and watching them after the fact. In order to allow participants to jump into the program at any point during the summer, all the live sessions were recorded and posted. Students who participated live were able to engage in candid moments with the host and guest, while those who watched the recorded versions missed that opportunity. Consequently, as we decide how to tailor this program for the future, we will consider what is lost by holding only live sessions for a virtual program, and whether recorded sessions can be tailored for a particular audience. We hypothesize that a program composed solely of live sessions would create a special cohort and opportunity for participants. Additionally, we are considering editing recorded sessions to be more engaging to viewers, as opposed to simply recording live sessions and posting them unedited.
The final implementation issue was the acceptance and submission of the weekly challenges. Each week's topic was connected to a weekly challenge for the participants to engage in. As a reward for participating, each submission earned a point, and these points determined which prize tier a participant earned at the end of the program. In order to make participation equitable for all students regardless of access to materials or technology, participants were allowed to submit their challenge results in any form, including photos, videos, and written explanations in a document or an email. These were emailed to the program manager, who collected them for sharing on Friday. One issue we encountered early in the program was that submissions were landing in the program manager's spam filter; this was fixed by checking the filter weekly to ensure submissions were received and shared. Creating a shared public Google Drive was considered but decided against due to privacy concerns (e.g., names and images of minors). A few times a file was submitted in an unrecognized format, which was fixed by emailing the submitter and asking them to send the file in another format.
Post-Program Reflection. We are extremely proud of the program we created, especially given the restrictions and constraints of the pandemic. One of the opportunities that this virtual platform provided was a broader audience reach. For decades, our education and outreach programs have been limited to local students. While this has merit in that we are helping students in our local community learn about STEM, STEM careers, and the MagLab, we have often wondered how we could gain a national presence. The SES program was our first foray into a national program. We are currently reaching out to colleagues in informal STEM education (ISE) to determine the pros and cons of national versus local programs as tools that can improve and maintain STEM interest at a crucial age. We hope to learn more from our colleagues about what new challenges arise when creating a cohort in a virtual space and how best to engage students in hands-on activities through a virtual medium. The SES program helped us begin to see how ISE can engage audiences in a virtual setting, which will help us continue to reach youth as we move forward into the new normal post-pandemic.

AUTHOR INFORMATION
Corresponding Author
Roxanne Hughes. National High Magnetic Field Laboratory, Florida State University. hughes@magnet.fsu.edu

Author Contributions
The manuscript was written through contributions of all authors. All authors have given approval to the final version of the manuscript.

FUNDING SOURCES
The National High Magnetic Field Laboratory is supported by the National Science Foundation through NSF/DMR-1644779 and the State of Florida.