Enhancing the lecture theatre for an inclusive hybrid learning experience

Thank you for registering to attend our online seminar and Q&A. You can find the recording of this event above.

On Thursday 9th September, we took a deep dive into how the team at Queen's University Belfast have adapted their lecture theatres to enhance the learning experience for all students, whether attending in person, via lecture capture or in a hybrid setup.

We discussed how to tackle a range of real-world challenges, explored how you can make the most of a similar system, and gave a live demonstration of the solution from a 300-seat lecture theatre at Queen's University Belfast.

The event was followed by a Q&A session with the team behind the project, including representatives from Pure AV and Sennheiser.

Below, you will find answers to the questions that were asked, some information regarding the Sennheiser range of solutions, and you can also find our full case study for this project here.

Thank you again for registering, and if you would like to discuss this project in greater detail with our account management team or one of our presenters, please contact us using the form below and a member of our team will be in touch.

 

Questions and Answers from the session:

Q: As part of the evaluation, it was mentioned that the lowest satisfaction (70%) was around some technology. Can you say more about what this was?
A: The 70% score related to the student capture technology. It wasn't that students disliked it or thought it unsuitable, but that they expected to make less use of that side of the solution than of the rest. I'd also like to add that we'd be more than happy to share the evaluation report with our colleagues in higher education. Please reach out to us over e-mail.

Q: Is MS Teams the tool used to broadcast teaching sessions?  If so, how are those sessions scheduled/managed?  Are they scheduled automatically (so totally hands-off), or do they need the lecturer to schedule and launch the remote elements?
A: We are big fans of Microsoft 365 products. We use Teams to deliver our teaching and learning, and for meetings as well. However, the solution can be used with any other platform that runs on a Windows PC, an MS Teams room kit, a Zoom room kit or any collaboration device that you might have in the teaching space. Regarding the second part of the question: we migrated our student groups and modules over to Office 365 at the beginning of the last academic year. Originally, we were a very traditional university delivering face-to-face teaching, but to work around the global pandemic we migrated everything to Office 365. Part of the process was automated for the teaching staff, in that their modules and virtual classroom environments were populated automatically for them prior to the start of the academic year. From there, it was very easy for them to create the meetings, whether by scheduling them on their calendar or by initiating a call from within MS Teams, which also made it easy for them to share content with their students.

Q: You mentioned that students joining remotely could ask questions through the Teams chat - are students able to ask their questions in Teams so everyone in the room can hear it?
A: Yes. An important part of this solution is creating a like-for-like bridge between the virtual and the physical classroom environment. Last year, some students came into socially distanced classrooms while fellow students joined from home. With the very simple user interface, teaching staff can click and drag a Microsoft Teams window to the second monitor, which makes that window appear on the projection screen and brings the video of the remote attendees into the physical classroom. It is then simply a matter of the speaking student unmuting their microphone to address their colleagues in the classroom. Vice versa, the students in the classroom are able to speak to the remote students via the Sennheiser ceiling microphones.

Q: Have you applied similar technologies in seminar settings?
A: Yes, the project included a number of smaller teaching spaces or seminar rooms. The small spaces used a single Sennheiser TeamConnect 2 microphone to cover the whole room from an audio perspective and a Panasonic AW-UE4W fixed camera for video. The smaller rooms don't have the audience camera tracking found in the larger and medium spaces, but they have the same audio solution to provide conferencing.

Q: How many mics?
A: In the large lecture theatre, we use eight Sennheiser TeamConnect 2 microphones to cover an area of 17m x 13m. In the medium lecture theatres, we used six microphones to cover an area of 14m x 10m. The small seminar rooms were typically 6m x 6m, so a single TeamConnect 2 microphone was enough to cover that area.
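As a back-of-envelope illustration of the room sizes and microphone counts quoted above, the coverage works out at roughly 23-36 square metres per microphone. These figures are our own arithmetic for illustration only, not a Sennheiser coverage specification; real coverage depends on ceiling height and acoustics.

```python
# Coverage per microphone from the room sizes quoted in the answer above.
# Illustrative arithmetic only - not a Sennheiser coverage specification.
rooms = {
    "large lecture theatre":  (17 * 13, 8),  # (area in m^2, microphone count)
    "medium lecture theatre": (14 * 10, 6),
    "small seminar room":     (6 * 6, 1),
}

coverage = {name: area / mics for name, (area, mics) in rooms.items()}
for name, per_mic in coverage.items():
    print(f"{name}: {per_mic:.1f} m^2 per microphone")
```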

Q: Is MS Teams the tool used to broadcast teaching sessions?  If so, how are those sessions scheduled/managed?
A: At Queen's, we mainly use MS Teams for remote teaching sessions. The sessions are often scheduled from within a Team by the lecturer (using the Meet Now or Schedule a Meeting options), which is simpler than scheduling from the calendar. However, some lecturers still create sessions by scheduling a meeting from their calendar and inviting a whole Team, or by sharing the session details in a Team.

Q: Is the confidence monitor a separate pc logged into the meeting session or a hardwired switch?
A: We have a PC with dual monitor outputs, giving a main and an extended desktop that the PC drives directly. We also have the option to run the monitors from the switch, so we can switch between a direct link to the PC and a link via the switch. It can be done both ways.

Q: How well do the tracking cameras and ceiling mics work in a theatre with 300 people in the audience, making noise throughout a lecture?
A: It completely depends on how you condition and monitor the data. We created zones for the microphones so that the right microphone is used for the camera tracking. The other element is the fail-safes within the programming itself: if someone is having a conversation or whispering in the audience, the cameras don't move over to them. You need to monitor both the level of the signal and how long the signal is present for. If someone coughs, you don't want the cameras to point at them. A big part is zoning the area and, in the programming, establishing the audio level and the length of time the audio must be present before the camera tracks; get that right and you get a really stable, high-quality solution.
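The level-and-duration gating described above can be sketched as follows. This is a hypothetical illustration of the idea, not the actual QSC Core programming; the `ZoneGate` class and both threshold values are assumptions chosen for the example.

```python
# Hypothetical sketch of per-zone gating: a zone only triggers camera
# tracking once its microphone level has stayed above a threshold for a
# minimum hold time, so coughs and whispers are ignored.
# Both values below are illustrative assumptions, not the real config.
LEVEL_THRESHOLD_DB = -35.0   # minimum level to count as deliberate speech
HOLD_TIME_S = 1.5            # level must persist this long before tracking

class ZoneGate:
    def __init__(self, zone_id):
        self.zone_id = zone_id
        self.above_since = None  # time when level first exceeded the threshold

    def update(self, level_db, now):
        """Return True when the camera should track this zone."""
        if level_db < LEVEL_THRESHOLD_DB:
            self.above_since = None       # brief noise (a cough) resets the timer
            return False
        if self.above_since is None:
            self.above_since = now
        return (now - self.above_since) >= HOLD_TIME_S
```

A sustained question holds the level above the threshold long enough to pass the hold time; a cough drops back below it and resets the timer, so the camera never moves.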

Q: How do you manage background noise in a theatre filled with 300 people in the audience?
A: By cleverly prioritising audio, such as from the lecturer's position, all other sounds are reduced in level so as not to affect the intelligibility of the main audio that needs to be heard. The same can be achieved with student questions: the Sennheiser TeamConnect Ceiling 2 diminishes every sound outside its 30-degree directive beam, so only the question from the student is prioritised.

Q: I notice 2 camera feeds...how are they switched to be main etc?
A: Firstly, the lecturer can decide whether to include the audience within the lecture capture/transmission. There is an audience participation mode that can be selected at any point in the lecture, which enables the audience microphones and the associated processing and camera tracking. The system uses the audio from both the audience and the lecturer to decide which camera to switch to. If the lecturer is talking, the system switches to the lecturer camera; if a question is detected from the audience, the system switches to the audience camera, which tracks the location of the question. The switch has a short delay to prevent over-detection and to allow a more produced transition between the two. It is an automatic switch, controlled by audio detection.
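The automatic switch described above, defaulting to the lecturer camera and only cutting to the audience after sustained audience speech, could be sketched like this. The `CameraSwitcher` class and the 1.5-second delay are assumptions for illustration, not the values used in the installation.

```python
# Illustrative sketch of automatic camera selection: the lecturer camera is
# the default, and the system only cuts to the audience camera once audience
# speech has been detected continuously for a short delay.
# The delay value is an assumption, not the real installation setting.
SWITCH_DELAY_S = 1.5

class CameraSwitcher:
    def __init__(self):
        self.active = "lecturer"
        self.audience_since = None

    def update(self, audience_speech, now):
        """Feed in whether audience speech is detected; returns active camera."""
        if not audience_speech:
            self.audience_since = None
            self.active = "lecturer"        # lecturer camera is the default
        else:
            if self.audience_since is None:
                self.audience_since = now   # start the debounce timer
            if now - self.audience_since >= SWITCH_DELAY_S:
                self.active = "audience"    # sustained question: cut to audience
        return self.active
```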

Q: How many classrooms and lecture theatres have been upgraded to use this technology for hybrid teaching in Queen’s University Belfast?
A: This project was a pilot for us, and we upgraded seven teaching spaces, some large such as this one, others very small. There were two reasons. One was for our teaching staff to have access to these different options, evaluate the technology and see what works best and what enhancements are needed. The other was to see how it works across different teaching spaces from a scalability point of view: will it work with the same user interface across rooms of different sizes and architecture? This was a milestone that we achieved very successfully; the solution worked very well in large, medium and small teaching spaces. From there, we have put our teaching spaces into different categories across campus, so from these benchmarks we were able to carry out bulk upgrades across campus.

Q: What happens if a student coughs?  Does the camera move to him/her?
A: You will have seen in the demonstration that there is a slight delay between Mohamed asking the question and the camera arriving there. This is there to prevent the odd cough triggering a camera. You don't want the camera darting around, so the accuracy of the information coming from the microphone is important, but there is also an element of programming within the QSC Core that stabilises that situation and makes sure we don't pick up a whisper. The microphones are that sensitive, so we have to manage that, and that's why you see a short delay.

Q: What system is tracking the lecturer movement?
A: There are actually two cameras in the theatre. The lecturer camera is the AVer PTC500S, which uses its own optical motion sensor to track the lecturer. As the lecturer moves, the camera picks that up and follows the movement; that's optical detection, not audio detection. If there's no movement, it automatically opens up to a wide shot. The audience camera is the QSC camera, working with the QSC Core and the Sennheiser TeamConnect 2 microphones, and is triggered by the audio side of things. The microphones listen for a question and then direct the camera, so it uses audio tracking to drive the camera position rather than the optical tracking used by the AVer.

Q: Nothing mentioned about loudspeaker setup. What do you use? And is it split into different zones?
A: Some elements of the equipment, including the speakers, already existed in this project. We had a lecture theatre with an existing sound system, and we rewired and reused those speakers in the ceiling. There is an element of voice lift for the presenter and, to some extent, a bit of voice lift for the audience, with mix-minus processing going on. The existing speakers were zoned to match up with where the microphones are, so we don't get feedback.
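The mix-minus routing mentioned above can be illustrated with a minimal sketch: each speaker zone receives every microphone signal except the one from its own zone, so a voice is never re-amplified through the speaker directly above the talker. The zone names and levels here are invented for the example, not taken from the installation.

```python
# Minimal sketch of mix-minus routing: each speaker zone's feed sums all
# microphone signals EXCEPT the microphone in its own zone, which is the
# standard way to provide voice lift without feedback.
# Zone names and signal levels are illustrative assumptions.
def mix_minus(mic_signals, zone):
    """Sum all mic signals except the mic belonging to `zone`."""
    return sum(level for z, level in mic_signals.items() if z != zone)

mic_signals = {"front": 0.8, "mid": 0.1, "rear": 0.05}
front_feed = mix_minus(mic_signals, "front")  # mid + rear mics only
```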

Q: Is the Q-SYS DSP the only option for this, or will Sennheiser work with other DSPs?
A: TeamConnect Ceiling 2 is perfectly fine to use with any good DSP that has good acoustic echo cancellation. If you're talking about camera tracking, this is all done from the control side; the fact that the QSC processor does both DSP processing and control means you've got both elements in one. Sennheiser are happy to work with any DSP manufacturer with a good acoustic echo cancellation algorithm. In fact, there are a number that Sennheiser already work with, and here is a link to their partner page - https://en-uk.sennheiser.com/global-alliances

Q: Do you use lecture capture (Echo360) and, if you do, how have you found the quality of the audio that has been captured? Does the recording also capture students talking, eating crisps etc?
A: We do use lecture capture, and the solution we've put in place will work with different lecture capture platforms, whether it's a hardware recorder or a software recorder running on a Windows PC; it works just fine and can be automated and scheduled. This works with Extron and Crestron control, and the Sennheiser technology and the QSC plugin can work with different control systems. With regard to any noise in the lecture theatre, the system has 32 zones using the 8 Sennheiser microphones and will disregard noise, not moving the camera until there is a continuous speech pattern. If someone coughs, the camera will not pick them up and will focus on a speaker within the audience instead. Feedback from students is that yes, they would like to be taught live, but also that they would like the recordings to be available for revision purposes. All the lecture capture applications we've used so far have worked really well; the videos are almost professionally produced. So far so good.

Q: Thinking more specifically about audio capture with the Echo360 - how have you found the quality of the audio that has been captured? Does the recording also capture students talking, eating crisps etc?
A: The TeamConnect Ceiling 2 microphones are auto-beamforming and produce a directive 30-degree beam to home in on the loudest sound in the room. You can set a level threshold that triggers the mics to become active; once they home in on the audio that triggered them, everything outside that 30-degree beam is diminished in level, and the intelligibility of the direct audio should come through very clearly. If the sounds of the crisp packet are loud enough to trigger the mics, then this is not just a problem for the recording or the far-end listeners but likely to be a major issue for everyone in the room too, so it is likely that the person making that noise would be encouraged to eat a little more quietly!

Q: Just to clarify, there are 2 tracking cameras in the LT:  one to track the tutor at the front, the other to cover all the students in the LT?
A: Correct. The AVer PTC500S tracks the lecturer and the QSC camera tracks the audience questions. Please see the question above regarding 'What system is tracking the lecturer movement?' for further detail and explanation.

Q: Are the video feeds going direct in the PC running MS Teams or does the room have a MTR device?
A: The video feeds from the presentation and cameras, along with the audio, are all converted into USB feeds and connected directly to the PC which hosts the Teams session. There is no separate MTR device within the solution.

Are you ready to start your next audio visual project? Contact our team of experts today!