Better Biopharma

Aligning Upstream And Downstream Development (Part 1)

Tyler Menichiello


In this episode of “Better Biopharma,” panelists from the Bioprocess Online Live event “Optimizing Process Development Through Upstream And Downstream Integration” continue their conversation about how upstream and downstream development teams can work together to accelerate timelines and reduce risks and bottlenecks in process development.

Keep up with everything biomanufacturing at Bioprocess Online: https://www.bioprocessonline.com/

Follow Tyler Menichiello on LinkedIn: https://www.linkedin.com/in/tmenichiello/

Subscribe to our newsletters: https://www.bioprocessonline.com/user/edit/subscriptions

Tell us how we can improve. Take our 60-second survey: https://docs.google.com/forms/d/e/1FAIpQLSfZGJluCmEhm1MBQvcm7_gtBEXuTx5v42LJ1mx9bCUKo8fwfg/viewform?usp=header

SPEAKER_01

Hello and welcome back to Better Biopharma, the official podcast of Bioprocess Online. I'm your host, Tyler Menichiello, and on this episode, we're going to break the mold a little bit. You see, back in February, I moderated a Bioprocess Online Live event panel with five industry experts who discussed how to optimize process development through upstream and downstream integration. That conversation was so fruitful and engaging that by the time the hour was up, it felt like we'd only just begun. So, in a first for Bioprocess Online and Better Biopharma, we decided to continue that conversation right here on the podcast. This is part one of that conversation. Joining me for the second time on the show is Eric Dewar from Sanofi, where he is the technical lead for drug substance manufacturing science and technology. Eric is joined by Mark Fitchman, founder, president, and CEO of the CMC consulting group SomaTech. Doug McDonald, the director of purification process development at Immunome, an ADC company, also joins us, as well as Dr. Brandon Salinas, the VP of Process Sciences at Umoja Biopharma, a company pioneering in vivo CAR-T. Unfortunately, our fifth panelist, Raj Prabhu Vijayakamar Saraswati, the director of biologics process development at Rap Therapeutics, was unable to join us for this recording. So, Doug, Brandon, Eric, Mark, thank you so much for joining me on this episode of Better Biopharma and for being here to continue the conversation on upstream and downstream integration in process development. To start off, I wanted to see if there were any points you'd like to revisit from the live event that we can flesh out more now that we're unbound by a live format.

SPEAKER_02

Well, one of the things I was thinking is that a lot of the live version was, I don't know, a little bit of a love fest. I think we all agreed with each other on most things, and it's always more interesting when we don't agree. At the very end, there was some disagreement about when and how to do temporary process locks and what their value was. Unfortunately, I think it was Raj who was taking the contrary position, so it would be good to have him back here for that.

SPEAKER_01

Over process locks, you're saying there was some disagreement?

SPEAKER_02

Yeah, and by the way, I think the disagreement probably wasn't really a disagreement; it was maybe a slight discordance in the language we were using. But I always think it's more interesting when we're not all on the same page.

SPEAKER_01

Yeah, diversity of opinion, right? Diversity is always more interesting. So we're not going to rehash the full live event here, but there are some points we wanted to revisit, or dig into more deeply, that we didn't have time for during that live hour. We spoke a bit about where we draw the line between upstream and downstream, and how modality and company type impact that. And we spoke a little about the bad things that can happen when upstream and downstream, the left hand and the right hand, aren't talking to each other. So I wanted to revisit that and see who wants to go first. Where do you see the biggest opportunities to improve coordination and communication between upstream and downstream teams, whether at your own companies or across your careers?

SPEAKER_04

Yeah, I'd like to go first here. I think one way we can achieve that is by co-locating the offices and the labs, and organizing people so they intermingle as much as possible. In some facilities, and you may be designing them de novo, consider this from the beginning. Don't separate your labs onto different floors if you can avoid it. Have the downstream folks in the same area if you can, particularly if you're talking about manufacturing-sciences-type functions. I would keep them close together and intermingle the seating among all of the collaborating groups: analytical, drug product, upstream, and downstream. One of the things this helps with is avoiding data interpretation in silos and brainstorming in silos, where another group might have a solution to your problem, but if you're not talking to them on a regular basis, you're not going to know that, and you're going to try to solve it in a potentially counterproductive way. So really, as much as you can, think about the physical layout of your facility, what constraints you have, and how much more you could commingle the different groups.

SPEAKER_02

Yeah, thank you for that. I completely agree; I just want to add one thing. You made the comment that maybe you don't know the other group has a solution to your problem. I think a lot of times they don't know they have a solution to your problem. You're chatting and they say, "Oh yeah, we know how to do that," but they wouldn't even know to tell you if you weren't having that interaction.

SPEAKER_01

Yeah. Can you think of specific instances, whether at your current companies or in the past, of problems that would have been easier to solve if folks weren't siloed and could have just talked to one another?

SPEAKER_00

Yeah, I think I can share one on that. Well, just real quick to add to the love fest: I agree with proximity. Where I am now, we're a startup, a little more advanced, but still considered a startup. The head of analytical sits right next to me, along with the head of upstream and the head of cell line; we're all together. And I feel like I've learned more about these other functional areas in the last year here at Immunome than in all my other years combined at larger companies. So proximity is pretty important, and so is CMC team representation. A lot of times, to be a CMC lead, people are looking for a lot of experience, where you've managed multiple functions or at least had a lot of exposure to them. But bring the people who are doing the work to the CMC meetings and let them give the updates. That's one way for the specialists to get exposure to the other departments and their needs. So that was to tag onto the last question. But for this one, I may have mentioned it in the other panel: we get data packages back from CDMOs. We don't do the work here; we're building laboratory capability, but we don't have it yet, and the packages are limited at times. You get tables of SEC results, charge variants, things like that, and you make decisions: okay, that process condition looks good, I think we'll stick with this and move forward. But then an analytical person, who is more attuned to seeing the minutiae of these things, will look at the chromatograms themselves and see potential red flags or trends.
For example, in a non-reduced CE, you can see small peaks that a process person might not think are anything concerning because the purity is still over 90% or whatever. But the analytical person sees that there's reduction happening, some clipping or mAb reduction that we should keep track of. You dig in deeper and you see it happening. Whereas when you're just getting a table from your CDMO, it's not necessarily telling you the entire story. I've seen that happen several times, and it's trained me to look at the more detailed data sets, or to ask for them if they're not provided.

SPEAKER_03

To piggyback on both Brandon's and Doug's points there: what's really important is ensuring that your discrete and your continuous data sets are interpreted together, as one package. As we go through the different stages of process validation and the life cycle, it's very easy to add or remove specific tests or assays, but we need to remember what those assays are telling us. Exactly as Doug was saying about seeing the minutiae of a particular assay, and interpreting that against how, for example, a TFF is performing over time and over the lifetime of that system. These are potentially early indicators of how our process is performing and how robust it is. So whether we capture that with a robust in-house monitoring program, or marry it with an appropriate AI-based tool or methodology trained to catch that minutiae, the important point is making sure your discrete and continuous data sets really speak to each other, especially between two different functional groups.

SPEAKER_04

I'd like to piggyback off of that one, Eric. I think it helps to structurally set that up so it's easy for your IT and informatics systems: don't use different software on the upstream and downstream teams to interpret data, use the same kind of historian, and use the same structures for your laboratory notebook entries. That way, mining the data historically is easy for anybody, whether they're in the upstream or downstream group, and you keep the historical knowledge from all those old assays you don't run anymore accessible.

SPEAKER_02

I want to circle back to something Doug mentioned. I think you were referring to an electropherogram with small peaks. Everybody's busy, and people basically want to say, "Just tell me what the recovery was, tell me what the purity was," right? And you've got to; everybody can't drill into everybody else's work. But I think there's too much "just tell me" and not enough "periodically, would you show me?" Because a lot of our data, in the form we use it, is just a number, but in the form we interpret it, it's a picture, whether that's an electropherogram, mass spec data, or a chromatogram. When everybody's busy, nobody really wants to look at the raw data, and you can't do it all the time, but you can do it sometimes. A lot of times you've got a junior person taking data off the HPLC equipment and reporting numbers, maybe not noticing that the baseline is noisier than it was last week, getting back to Doug's example. I think that's where some missteps that are going to hurt you in a few weeks or a few months could be caught right then.

SPEAKER_01

Yeah, thank you, guys. Those are all great answers. And Brandon, something I'd never really thought about before is the possibility of upstream and downstream teams using different data capture systems. In your experience, is that a prevalent issue, or is it just a cautionary tale?

SPEAKER_02

No, it happens all the time. Downstream people look at these tables that upstream people generate and it looks like gibberish to them, and vice versa. And by the way, not only does that make it hard to catch something, it also makes it hard to empathize with what the other team is struggling with.

SPEAKER_03

Yeah, that's a good point too, because a lot of our early-stage process development equipment is really designed for an upstream-only or downstream-only interpretation capability. It's rare that you see a device or system that has that active bridge. So intrinsically, there's a bias in how these teams perform, because they're receiving different service opportunities, and working with vendors who are looking at different metrics as well. So one, there's a need to standardize the equipment and software. And two, there's a need to understand the important metrics, your CPPs, KPPs, CQAs, and KPAs, for each group. I think both of those aspects definitely come into play.

SPEAKER_04

There's a whole range of problems. It can be as simple as the downstream team using Cytiva equipment and the upstream team using Sartorius equipment, and you don't have a historian that can put them in the same language. But there are also simple fixes: if your upstream team uses Prism and your downstream team uses SAS JMP, fix that, right? That's an easy win for aligning teams. Another one is how you code things. Most companies are using electronic laboratory notebooks now, and there's so much potential there. If you code everything in a similar way, and the metadata you associate with notebook entries is structured the same way across all of your groups, then it's far more interpretable, understandable, and mineable by everyone.
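To make the "code everything in a similar way" idea concrete, here is a minimal sketch of what a shared metadata check for ELN entries could look like. The field names and the controlled vocabulary are hypothetical, not any ELN vendor's actual schema; the point is that a single required-fields list and one agreed spelling per process step is what keeps upstream and downstream records mineable with the same query.

```python
# Illustrative sketch of a shared ELN metadata schema check.
# Field names and vocabulary below are assumptions for the example,
# not a real ELN product's API or any company's actual scheme.
REQUIRED_FIELDS = {"program", "molecule", "process_step", "run_id", "date", "analyst"}

# One agreed spelling per step: a controlled vocabulary keeps
# "Protein A", "protA", and "ProA" from becoming three unsearchable
# spellings of the same unit operation.
ALLOWED_STEPS = {"cell_culture", "harvest", "protein_a", "polish", "uf_df", "drug_product"}

def validate_entry(metadata: dict) -> list:
    """Return a list of problems; an empty list means the entry is mineable."""
    problems = [f"missing: {f}" for f in sorted(REQUIRED_FIELDS - metadata.keys())]
    step = metadata.get("process_step")
    if step is not None and step not in ALLOWED_STEPS:
        problems.append(f"unknown process_step: {step}")
    return problems

# A downstream entry with a free-text step spelling gets flagged:
entry = {"program": "mAb-01", "molecule": "IgG1", "process_step": "protA",
         "run_id": "R-0042", "date": "2024-02-01", "analyst": "jdoe"}
issues = validate_entry(entry)
```

Run at save time, a check like this makes the cross-group convention self-enforcing rather than a matter of training and memory.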

SPEAKER_00

Yeah, I'll echo that notebook part. I strongly advocate for building workflows that connect everything, even just your sample through the whole process, your test condition. Learning from prior knowledge is so important for moving faster and doing more without spending so much money and hiring so many people. We're just getting started with that here, but in my previous role at Seattle Genetics, there were monumental efforts made to link everything together and learn from it. And then Pfizer comes in and acquires them, and despite having had 500 programs, they don't have that system in place, which is just remarkable. They have a wealth of data, to the point where they probably wouldn't even need to do experiments anymore for an IND, but they're not using it. I guess that's why pharma doesn't typically innovate like that, whereas biotech does.

SPEAKER_02

I like Brandon's point about mineable data. Things are always going to go wrong, and anything that makes looking into it easier helps, because you can sink a lot of time and money into scratching your head and asking, "What changed?"

SPEAKER_01

Yeah, I'm curious: to what degree does this fall to the upstream and downstream teams themselves? Brandon, in your example you mentioned Cytiva and Sartorius as two different equipment and software vendors. Is it a lack of comprehensive, continuous, end-to-end equipment and software solutions out there, or is it just that some vendors have upstream strengths, some have downstream strengths, and the teams gravitate toward different priorities?

SPEAKER_04

Mostly it comes down to being proactive about it from the beginning. How are we going to use the data when we're acquiring these systems? Are we investing in our informatics and data engineering teams to make these tools usable? There are so many tools out there now that can automate how you analyze data and integrate time-based continuous data from multiple machines and vendors into one system, so you can easily overlay everything. And now, with AI getting pretty good at coding, we took a real headache of a problem and, over a weekend of AI-assisted coding, built a pretty streamlined solution for how to collect, analyze, and present data. So there's a ton of potential there. You just have to be proactive about it instead of reactive at the end, when you have 20 different systems people are using and are trying to share data between their groups.
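The integration step Brandon describes usually boils down to parsing each vendor's export into one common "tidy" layout: one row per reading, tagged with its source. The sketch below shows that idea with two made-up export formats; the column names, delimiters, and units are assumptions for illustration, not Cytiva's or Sartorius's actual file layouts.

```python
# Illustrative sketch: harmonizing time-series exports from two
# hypothetical vendor formats into one tidy record layout.
# Column names, delimiters, and units are invented for the example.
import csv
import io
from datetime import datetime

def parse_vendor_a(text):
    """Hypothetical upstream export: 'timestamp,pH,DO_pct' with ISO timestamps."""
    rows = []
    for r in csv.DictReader(io.StringIO(text)):
        t = datetime.fromisoformat(r["timestamp"])
        rows.append({"time": t, "parameter": "pH", "value": float(r["pH"]), "source": "upstream"})
        rows.append({"time": t, "parameter": "DO_pct", "value": float(r["DO_pct"]), "source": "upstream"})
    return rows

def parse_vendor_b(text):
    """Hypothetical downstream export: 'Time;UV_mAU', semicolon-delimited, epoch seconds."""
    rows = []
    for r in csv.DictReader(io.StringIO(text), delimiter=";"):
        t = datetime.fromtimestamp(int(r["Time"]))
        rows.append({"time": t, "parameter": "UV_mAU", "value": float(r["UV_mAU"]), "source": "downstream"})
    return rows

# After parsing, every reading is (time, parameter, value, source), so
# overlays and mining queries look the same regardless of which
# instrument or vendor produced the data.
a = parse_vendor_a("timestamp,pH,DO_pct\n2024-01-01T00:00:00,7.1,45.0\n")
b = parse_vendor_b("Time;UV_mAU\n1704067200;12.5\n")
combined = sorted(a + b, key=lambda r: (r["time"], r["parameter"]))
```

In practice a historian or a database plays the role of `combined`, but the design choice is the same: convert at the edge, store in one schema.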

SPEAKER_01

Yeah. Thank you, Brandon. And that raises a question you touched on before: the importance of mineable data. If we could go around the horn here, what's the most actionable, concrete advice you'd give your peers on how to ensure that the data you're collecting upstream, downstream, and in the middle is mineable and is going to be helpful to everybody who touches it? What are some principles, or things you've learned the hard way, that you'd like to share?

SPEAKER_02

Well, I think firstly, collect it. That's a good start.

SPEAKER_01

That'll help. That'll help for sure.

SPEAKER_02

I don't think I have anything too intelligent to say after that, because it gets more thorny. But a lot of the problems I've encountered over the years come down to the observation not being codified in any way. People made an observation, and then four months later you're saying, "Well, Bill remembers." That's not mineable.

SPEAKER_00

Yeah, and even the way you collect it matters. We mentioned the upstream data: teams generate these massive spreadsheets and then plot metabolites or whatever they're interested in, but a lot of the time that's not ready for statistical use. When you get data from CDMOs, or maybe even internally, it would be good to have them export their data sets to you in a way you can plug right into JMP, for example, which seems like an industry standard in a lot of ways. Everybody has their preference or their expertise. And that all starts with ELN notebooks as well: you don't want to put data in there just to get it in there. It has to be extractable, and in a format you can actually analyze and learn from.
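The "statistics-ready" export Doug asks for usually means reshaping a wide summary table (one column per assay) into long format, which is what JMP, R, and most Python stats tools model most easily. Here is a minimal sketch; the assay column names are hypothetical, not a real CDMO's report layout.

```python
# Illustrative sketch: reshaping a wide CDMO-style summary table into
# the long format most stats tools expect. Column names are invented.
import csv
import io

WIDE = """condition,SEC_monomer_pct,CEX_main_pct,titer_g_L
A,98.2,71.5,3.1
B,97.4,69.8,4.0
"""

def wide_to_long(text, id_col="condition"):
    """Turn one-row-per-condition into one-row-per-(condition, assay) pair."""
    long_rows = []
    for row in csv.DictReader(io.StringIO(text)):
        for col, val in row.items():
            if col != id_col:
                long_rows.append({id_col: row[id_col], "assay": col, "value": float(val)})
    return long_rows

long_table = wide_to_long(WIDE)
# Each row is now (condition, assay, value), ready to group, plot, or
# model without per-spreadsheet manual rework.
```

Agreeing on this shape with a CDMO up front is a one-time conversation that removes a copy-paste step from every data package afterward.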

SPEAKER_02

Yeah, and some rudimentary data reduction right at the get-go. You're not going to throw away your raw data, but it's nice to get it to the point where a human can read it and look for things.

SPEAKER_01

Yeah. That's great, guys. Thank you. There was a question in our question set for the event that we unfortunately didn't have time to cover, and I was hoping to hear your thoughts this time around. At the leadership level, where do you feel early decisions should center? And pulling on that further, do you think upstream and downstream should be brought equally into those early conversations, and how early? Or is it one or the other? Because I think I've heard that upstream tends to be engaged earlier, and I've spent some time on the r/biotech subreddit and seen downstream scientists with strong opinions about that; they feel like second fiddle to upstream in some ways. I'm curious to hear your thoughts. I hope that's an understandable question. I'll throw it out to Brandon; you were nodding your head.

SPEAKER_04

Sure. Maybe I'll go back a little bit to the business drivers and the goals, right? Most likely your goal is something like get to process lock, get to your tox slot, or get to tech transfer to a CMO. Most of the time, the business goals are not specific to upstream or downstream; we need the overall CMC team to deliver X by a certain time. Keeping it focused that way, I think you have to have both upstream and downstream, among all the other stakeholders, involved in equal measure from the beginning. If you start with a particular goal that lends itself to an upstream focus, say yield or titer, then you risk having your goals too siloed, where one team might feel punished because another team was able to achieve its goals. If downstream has a purity goal and upstream's goal is titer, those two are often in tension with each other, and achieving one may preclude achieving the other. So you have to include them both together from the beginning, and it needs to be based on a business driver.

SPEAKER_02

Yeah, I'd expand on that a little and maybe even disagree a little. I realize we've moved away from your question about leadership decisions, but with respect to upstream and downstream coordination: I spend time in both worlds, and I'd say, stereotypically, downstream people tend to complain that the upstream people won't give them something consistent, so they're kind of whining, and the upstream people are not very sensitive to the fact that the downstream people actually have something to whine about. What I really think in most cases is the solution is this: you want upstream to keep optimizing, because that's going to really impact process economics. But if they could periodically lock, and then give downstream, to the best of upstream's ability, a non-changing feed stream until they break the lock and re-lock, then downstream doesn't have to deal with something different every week. They've got something they can rely on for three or four weeks, and then, they may not love it, you change their feed stream because upstream has made progress. I think that's a compromise. It doesn't suit anybody perfectly, but at least downstream isn't confused about whether the change they saw was their change or upstream's, so that helps them. And upstream gets to keep moving, but they pay the price of having to deliver something consistent, which chews up some of their resources. I think that's, in most cases, the least bad balance you're going to get.

SPEAKER_03

Yeah, Mark. Coming back to the leadership question, I think it's really about identifying and putting in place individuals who have robust upstream and downstream experience, who can speak to both sides of the process at your central point of contact, where it matters most. This individual should first and foremost be that cross-translational person who lets upstream understand why downstream is picky, and downstream understand why upstream is lackadaisical about consistency. But in the future state, also create that next generation of leaders who are flexible between upstream and downstream. So it's not only about having the right leader there at the right time; it's also about creating workflows, training programs, and cross-collaboration between upstream and downstream teams to build that training, that understanding, and that experience. Having the flexibility to pivot between those two teams, and to create a more solid midstream in between, is pretty key, and one of the main success indicators of a strong, working end-to-end process.

SPEAKER_04

One possible way to structure it, to Mark's and Eric's point, so you can balance it, is to have upstream folks who are dedicated to material generation. Maybe you do most of your experimentation and development at a smaller scale, and your downstream and drug product teams would like lots of material to work with. So you have a larger scale that's purely material generation, running the locked or most recent process version to supply your downstream and drug product teams for their goals and improvements. Then there isn't as much tension from the development team having to stop their development work to generate material for the other groups; you have a group set up with the expectation that, on some frequency, they're doing generation runs to support the downstream team.

SPEAKER_00

Thank you, Brandon. I've seen that work very well in medium-sized companies, where you have this material generation team, because that is a huge, time-consuming effort. I guess I would answer this question by taking a more high-level look: everybody wants to optimize their process and deliver this beautiful, optimized process and product, but at what point, for the stage you're at, do you shift to metrics that get you to the clinic faster? When is "it's good enough, use the platform" decided, versus "it's a commercializable process"? Where can we have metrics beyond just titer or downstream yield, something like cost per kilogram of antibody, or how many cycles those expensive purification resins last, versus prioritizing titer? I know that would take a lot more input from clinical, which you don't always have. But when it's crunch time to get to the clinic, the philosophy I've been trained to develop under is that more shots on goal is better: get to the clinic as fast as possible. And platforms become critical, right? If nine out of ten clinical trials don't get past phase one, you want more shots on goal, versus putting all your effort into the most beautiful process you have, and then it doesn't come back for later stages anyhow.

SPEAKER_02

I'm going to say that a slightly different way. I think it's easy for a lot of members of upstream, downstream, and even manufacturing teams to fail to treat the delivery date as a key performance indicator for the team: not just the quality of the product and the cost of the product, but the day it arrives. And by the way, that's one of the things that stresses everybody out, because you don't want to think the position on the calendar is one of your team's performance indicators, but it really is. Brandon, you made what I think is the only mention so far of drug product. We're kind of neglecting those folks; they're downstream of all of this. And to the extent that downstream people worry that upstream isn't giving them something consistent, well, the drug product people have that problem cubed, right?

SPEAKER_04

Yeah, it's probably me expanding the scope of the podcast topic here beyond upstream and downstream, but just remember that there are a ton of stakeholders. Analytical, you might even consider downstream of drug product, and quality downstream of analytical. There's almost always a customer on the back end that you should try to include.

SPEAKER_02

Hey, so Tyler, can you double the scope of the discussion panel next time and get the formulation people here?

SPEAKER_01

Ten people on one panel, what could go wrong, right? I would love to have more voices and functional roles present on these panels. It might be worth revisiting this topic next year with different flavors of expertise, right?

SPEAKER_02

As you mentioned, Mark, that could be good. And don't forget the accountants.

SPEAKER_01

No, of course. Yeah, we'll get them involved there. Thank you so much for listening to this episode of Better Biopharma, the official podcast of Bioprocess Online. We'll be releasing part two of this conversation with Eric, Mark, Doug, and Brandon soon. So stay tuned for that, and we'll see you next time.