Generating QCM-D data is relatively straightforward, but the analysis and interpretation can sometimes be tricky. QCM-D data contains a wealth of information, and each system studied is unique, offering different possibilities for deeper analysis. So how can this be addressed? At QSense User Days 2023 we had the privilege of hosting a panel session featuring four seasoned QCM-D users, each with decades of experience. During the session, I seized the opportunity to ask for their top advice to newcomers on how to speed up the QCM-D learning process. This is what the panel said:
Q: What would be your best advice to a new user of QSense QCM-D to facilitate data interpretation and analysis?
Prof. Jackman:
1. Get a basic understanding of your system and adopt best practices from literature
- Be friends with you and Kenneth* would be my first answer. 😊 More seriously, I would say: get a thorough understanding of similar systems from the literature. Try to really understand what people before you have done, and do not get too creative with modeling if you are not knowledgeable about it. Focus on interpreting the raw data in terms of what you can get from it. Does the experiment you designed make sense, from a protocol perspective, for answering the basic questions that you want to understand? Then, if you are not confident about how to model a system or understand it deeply, adopt best practices from the literature and follow modeling approaches that have been used before you.
2. Talk to experts in your area
- Second, don't be shy to reach out to experts. You know, I joke when I say be nice to Malin and Kenneth, but I also mean it seriously. Reach out and talk to other experts in the field. For example, I have read a lot of Prof. Richter's papers. Reach out to experts like him, or whoever is the expert in your area, and try to build collaborations. That is how I got into this field: by going to Stanford to first do research and learn from the best people there. So, if you're new to a field, always find the people really working at the cutting edge of what you do and try to work as a team rather than getting stuck in a silo. That will really help you get through the QCM-D introduction much faster.
Dr. Kellermeier:
3. Don’t look at too many things at the same time
- I can only second what Josh just said. I would make sure not to look at too many different things at the same time. Make sure, on the one hand, that what you measure is reproducible. And then, as Josh said, focus on the primary signals: use one frequency and one dissipation data set for a given overtone, and play with this data. Make sure that you understand what is happening before conducting a broader screening, which can, in the end, turn out not to give any value if the basic assumptions at the beginning were not correct.
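The "primary signals" mentioned here are the frequency and dissipation shifts of a single overtone. As a minimal sketch of what a first-pass look at such data might involve (this is an illustration, not something the panel prescribed), the widely used Sauerbrey relation converts the frequency shift of a thin, rigid film into an areal mass. The function names and the dissipation rule of thumb below are illustrative assumptions:

```python
# Minimal sketch (illustrative, not from the panel): estimating areal mass
# from one overtone's frequency shift with the Sauerbrey relation, which
# holds only for thin, rigid films with low dissipation.

C_SAUERBREY = 17.7  # ng / (cm^2 * Hz), mass sensitivity of a 5 MHz AT-cut sensor


def sauerbrey_mass(delta_f_hz: float, overtone: int) -> float:
    """Areal mass (ng/cm^2) from the frequency shift of one overtone.

    delta_f_hz: raw (non-normalized) frequency shift of the chosen overtone.
    overtone:   overtone order n (1, 3, 5, ...).
    """
    return -C_SAUERBREY * delta_f_hz / overtone


def sauerbrey_is_reasonable(delta_f_hz: float, delta_d: float, overtone: int) -> bool:
    """Rough rule of thumb (assumption): Sauerbrey becomes questionable for
    soft films, e.g. when |dD / (df/n)| approaches ~0.4e-6 per Hz."""
    df_n = delta_f_hz / overtone
    if df_n == 0:
        return False
    return abs(delta_d / df_n) < 0.4e-6


# Example: a -30 Hz shift measured at the 3rd overtone, with low dissipation
mass = sauerbrey_mass(-30.0, overtone=3)  # -17.7 * (-30) / 3 = 177 ng/cm^2
print(f"{mass:.0f} ng/cm^2")              # prints "177 ng/cm^2"
```

The point of the check function is exactly the panel's advice: confirm that the basic assumption (a rigid film) holds before building anything more elaborate on top of the data.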
Prof. Richter:
4. Use controls to help you select the interpretation scenario
- I fully agree with what the two previous speakers said. To make a point about the data itself, I think controls are important there as well. So, when I say try to understand your data, think of controls that enable you to discard one interpretation scenario, so you can confidently focus on another. That would be quite helpful as well.
Q: You all have slightly different approaches to QCM-D data analysis and the information you extract. How long does it take to get to know your system and figure out what information to look for?
Dr. Kellermeier:
- The rationale is a little bit different depending on whether you are an academic or an industrial researcher. In our case, we are often guided by input we get from performance tests. So, I can measure whatever I like, but if that does not translate into the properties of, for example, a detergent system in an actual cleaning test, then there is no point in looking for such effects. So, we are clearly guided by experience we have from colleagues, application tests, or other measurements, and then we pinpoint the effects which are most relevant for a realistic application scenario.
Kenneth:
- What you are seeing in QCM-D is, of course, the system. And the system is the instrument, your sensor, and your solution. You must have very good control over all components because it is a super-sensitive technique. Say, for example, that you have a protein solution on the bench. If you measure it within ten minutes, you will get one answer from the QCM-D. If you let the protein sit on the bench for one hour, you will get a second answer: you now have one portion of native protein and one portion of denatured protein. So, know your system before you collect the data. I think that is very important. You must know what you are looking at when you have the QCM-D signal.
Prof. Richter:
- In my lab, 80% of the QCM-D data that we acquire never leaves the lab. QCM-D is really good as a quality control instrument. Following up on what Kenneth said, it tells us if our proteins are well behaved or not. If not, we go back to the purification and make better proteins before we run the next experiment. So, in that context, QCM-D is a really useful technique to check that our reagents are working the way we expect them to work. That is often just troubleshooting that never makes it into publication. Yet, it is just as important as the data that makes it into the final publication. Only when we have those things under control can we start to ask specific questions: what numbers, what information, can we get by investigating our model? In my lab, we focus on models of the glycocalyx, but it could equally well be model membranes or other things on surfaces.
Prof. Jackman:
- I think it also depends on who your target audience is, in terms of how Prof. Richter looks at quality control versus how Matthias looks at performance evaluation. There are a lot of different possibilities for QCM-D, but also: who is your audience? Is it an internal audience, or for quality control? Is it showing that something works from a performance perspective? Is it gaining fundamental insight into the system? I can go back to some of our discussions, maybe the membrane-peptide or membrane and antimicrobial lipid interactions. The same dataset can be presented very differently depending on who I am talking to. If I'm talking to an industry partner, for example: "this has x percent disruption". If, on the other hand, I'm analyzing the data to publish a paper: "this is the structural transformation, and these are the details". So, QCM-D has a lot of potential because there is really a lot of information you can get from it, but you can also present or analyze it in a relatively simple way. Simple is not always the easiest way, but it may be the most effective way to communicate. You can also over-analyze data and build very complex models, but that may not make it easy to understand for the audience you are targeting. So, matching the analysis approach with your audience, or whoever you are trying to convince with the data, whether it is yourself or someone else, is most important. But I think QCM-D has a really nice balance, ranging from fully qualitative, more control-based analysis, as Prof. Richter said, positive and negative controls, "is it behaving as I expect?", to fully quantitative modeling that describes the viscoelastic properties of a material at a fundamental level, or, as Matthias pointed out, a performance evaluation perspective.
So, there are a lot of possibilities, but it depends on who your audience is, for example talking to a collaborator, building an industry project, writing a peer-reviewed paper, submitting a grant application or educating undergraduate students.
Concluding remarks
The versatility of QCM-D offers a spectrum of analytical possibilities, and analysis and interpretation can sometimes present challenges due to the richness of information in the data. We got the opportunity to talk to four experienced QCM-D users, who shared insights and strategies on how to shorten the learning curve if you are new to QCM-D. Their advice included gaining a solid understanding of the system through literature review, seeking guidance from experts, maintaining focus on reproducibility and primary signals, utilizing controls for robust interpretation, and aligning the analysis approach with the intended audience.