Tuesday, July 30, 2013

Automation, Asiana Airlines and Addlebrains

My second job was at the largest and most automated cell culture plant in the world. This automation was heralded (at least internally) as the mechanism that would ring in a new age of fewer errors, lower costs, lower process variability and higher efficiency.

It was also there that I met Mary, who relayed this paraphrased story:
My dad is an ophthalmologist and he says, "Mary... you can train a monkey to do my job 99% of the time. But you want me and not a monkey when something goes wrong that 1% of the time."
This is exactly what I learned from my days as a fermentation engineer at this uber-automated cell culture plant. Automation is perfect for handling routine jobs, tedious jobs, even moderately complicated jobs... not so good at managing exceptions.

My 3rd week on the job, my boss and grand-boss were at the annual Process Development offsite in Lake Tahoe. Their parting words were, "Good luck, Oliver! If anything really bad happens, call Bob."

As it happened, there was a batch feed operation (where they prep salty media, sterilize a line to an already-running production culture and pump said salty media in) during which a valve was left open and only a fraction of the media was actually delivered.

What did I do? I called Bob. This was essentially Bob's response:

"Make up another batch feed and send in the right amount. What's the problem?"

The problem was that the recipe was coded for exactly one batch feed; there was no provision in the automation for a second. We could have coded a loop to allow for a second batch feed, but since none was specified in the Biologics License Application (BLA), there was no justification to code the loop.
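
For illustration, here's how small that gap is in code terms. This is a hypothetical sketch in Python, not actual DeltaV recipe logic, and the helper names are invented:

# Hypothetical sketch -- NOT actual DeltaV recipe code.
def prep_media(volume_liters):
    print(f"Prepping {volume_liters} L of concentrated salt media")

def sterilize_transfer_line():
    print("Steaming the transfer line to the production bioreactor")

def pump_media(volume_liters):
    print(f"Pumping {volume_liters} L into the running culture")

# What was licensed: exactly one batch feed, hard-coded.
def run_batch_feed(volume_liters):
    prep_media(volume_liters)
    sterilize_transfer_line()
    pump_media(volume_liters)

# What a loop would have allowed: a second (or Nth) feed, e.g.,
# to make up a delivery shorted by a valve left open.
def run_batch_feeds(volumes):
    for volume in volumes:
        run_batch_feed(volume)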

We ended up using another branch of the recipe logic to complete the batch feed, but it took three times longer and was twice as complicated as a fully-manual batch feed.

Automation is the perfect solution 99% of the time. But there's always that 1% of the time when something goes wrong and does so spectacularly.

Asiana Airlines

Look no further than the Asiana Airlines crash at San Francisco International Airport. It turns out that Asiana pilots were not pilots; they were airplane operators. These putative pilots were capable of using the software engineered into that Boeing 777; they were not actually capable of flying the plane when "everything goes to heck in a handbasket."

So what's the answer to too much automation and not enough skill?

Well, if you work for the government, the answer is: more automation. The US Federal Aviation Administration issued an edict that all non-US carriers must use GPS systems when landing at SFO.

This bandaid is just that: a bandaid. In the immediate term, there likely won't be any more deaths. Long-term, this addlebrained approach allows "airplane operators" to masquerade as "pilots" all the while not being able to fly a plane when all goes to hell in a handbasket.

Automation in Biotech Context

Biotechnology in the US is very prone to these "if some is bad, more is better" scenarios because drugs - like airplanes - are regulated by the federal government. All too often, fixing the immediate pain comes at the expense of the long term; this is the nature of solving engineering problems in the most politically-expedient manner. Heads of technology departments and managers of automation departments need to be particularly vigilant and resistant to bandaid solutions that carry subtle long-term detriment.

Generally speaking, automation is good. But be certain that your long-term strategy of using automation does not undermine mission-critical skills. And tactically, be certain that the hows and whys of the status-quo engineering design get explained so that your staff can rise to the occasion and apply those mission-critical skills.

Monday, July 29, 2013

How to Institutionalize Tribal Knowledge

Is your knowledge going to walk out the door when key old-timers begin retiring?

Biotechnology was invented in the 1970's by Bob Swanson and Herb Boyer. Commercialization of this stuff (large-scale biologics manufacturing) started taking off in the 1980's.

We're closing in on 30 years since biotech manufacturing started. If you joined as a junior employee (in your 20's), you're now in your fifties. If you joined as a more senior employee (in your 30's), you're about to retire.

Chances are, a lot of your core competency is sloshing around in the brains of these folks. And when those guys are long gone, all that's left will be hard drives full of spreadsheets and documents.

How do you institutionalize tribal knowledge?

Zymergi has solved this problem for biotech companies with the help of Google. Prior to Google Docs and Google Mail for the enterprise, Google sold its search engine as a physical appliance. Customers purchase the appliance, slide it into a server rack and configure it to crawl their web-enabled network drives.

(The process is iterative, since there are quite a number of files on public network drives that ought not be there.)

From here, you basically get an intranet site that looks and feels like Google, except now, instead of searching the internet, you're searching critical enterprise documents, PowerPoint slides and PDFs.

The limitation with the out-of-the-box solution is that scope is limited to files on a network drive. Yet in an enterprise environment, your usable knowledge may be stored in back-end relational databases used by systems like SAP or TrackWise. How do you liberate the knowledge stuck behind the front-end application that has been engineered to run a business process (as opposed to serve data)?

ZST is Zymergi's answer. ZST is a web application that does one thing: make web pages out of relational database data. Coded in the .NET Framework, ZST runs in the Windows environment and, when configured, can expose your relational data to search engines for indexing.
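
ZST itself is proprietary .NET code, so what follows is only a minimal sketch of the pattern in Python/Flask (with a hypothetical "discrepancies" table) to show the idea of turning database rows into crawlable pages:

# Minimal sketch of the pattern (NOT ZST itself): render relational
# rows as web pages so an intranet search engine can index them.
import sqlite3
from flask import Flask

app = Flask(__name__)
DB = "quality.db"  # hypothetical database of QA discrepancies

@app.route("/")
def index():
    # One link per record so the crawler can discover every page
    rows = sqlite3.connect(DB).execute("SELECT id, title FROM discrepancies")
    links = [f'<li><a href="/discrepancy/{i}">{t}</a></li>' for i, t in rows]
    return "<ul>" + "".join(links) + "</ul>"

@app.route("/discrepancy/<int:rec_id>")
def discrepancy(rec_id):
    row = sqlite3.connect(DB).execute(
        "SELECT title, description FROM discrepancies WHERE id = ?", (rec_id,)
    ).fetchone()
    return f"<h1>{row[0]}</h1><p>{row[1]}</p>"

if __name__ == "__main__":
    app.run()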

With QA discrepancy data and engineering change order data juxtaposed with scientific memos, anyone with authorized access can learn your business the way they learn other subjects: using Google... on the intranet.

Want to see all discrepancies, change orders, process flow diagrams, campaign summaries, science memos about CD11a production culture pH?



Want to know about all contaminations at the seed train scale across the entire manufacturing network?



Managing the biologics manufacturing workforce of the future will require institutionalizing tribal knowledge.  One way to tech transfer from tribal elders to young blood is with ZST and an intranet search engine.

Thursday, July 25, 2013

Fermentation Analysis Software

There's this neat question on the Mathematical Modeling of Fermentation LinkedIn group about software used in fermentation:
I would like to ask about the software for the analysis of your fermentation processes. Software for analysis, but not for the fermentation control. Although, if you can say something about the control programs, it is welcome, too.

I suspect that the people in this group deal with small-scale or pilot plant-scale, but this question is actually worth answering for large-scale cell culture/fermentation.

In 1999, fermentation control software was basically a free-for-all. No single company had a stranglehold on the market. Allen-Bradley PLCs were popular, Siemens was popular, Honeywell was a good option... But over the following decade, the system that really took over the control layer is Emerson's DeltaV.

The reason this is worth talking about is that the data source is the instrument IO monitored by the control software. All analysis is preceded by data capture, archival and retrieval. DeltaV is the software that does the capture.
1) What software is used on your fermentation equipment?
Next up is the system that archives this instrument data for the long term. DeltaV has a historian, but the most popular data historian is OSIsoft's PI (a.k.a. OSI PI). And the reason is that PI has stellar client tools and stellar support. PI client tools like DataLink and ProcessBook are good for generic process troubleshooting and support. More sophisticated analysis requires statistical programs.
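
For a flavor of what that generic troubleshooting looks like downstream of the client tools, here is a minimal sketch assuming a CSV export of trend data (e.g., saved out of a PI DataLink spreadsheet) with hypothetical column names:

# Sketch: eyeball a fermentation trend from a historian export.
# File name and columns ("timestamp", "dO2_pct", "pH") are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

trend = pd.read_csv("T100_trend.csv", parse_dates=["timestamp"])
trend = trend.set_index("timestamp")

fig, axes = plt.subplots(2, 1, sharex=True)
trend["dO2_pct"].plot(ax=axes[0], title="Dissolved O2 (%)")
trend["pH"].plot(ax=axes[1], title="pH")
plt.tight_layout()
plt.show()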

Zymergi offers OSI PI consulting for biotech companies.

2) What software you prefer to analyze of your fermentations and for your future fermentation processes planning?

This is where there's a lot of differentiation in fermentation analysis software. My personal fave is SAS Institute's JMP software. This is desktop stats software that lets users explore the data and tease signal from noise or truth from perception. I've solved a ton of problems and produced answers to 7-figure questions with this software.

Zymergi offers MSAT consulting helping customers set up MSAT groups and execute MSAT functions.

There are others operating in this space, but I have yet to see any vendor make headway beyond trial installation and cursory usage.
3) Do you agree with the fact that the question of software for fermentation processes doesn't undergo a rapid development now?
None of these tools is fermentation-specific. Each is superior in its respective category:

  • DeltaV is a superior control system
  • OSI PI is a superior data historian
  • JMP is a superior data analysis tool
Where there is a gap is fermentation-specific analysis: how to link upstream factors to downstream responses.
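
A sketch of what closing that gap looks like (the batch-level data here is hypothetical; in practice the upstream factor comes from the historian and the response from LIMS, joined on batch ID):

# Sketch: link an upstream factor to a downstream response at batch level.
# All column names and values below are hypothetical.
import numpy as np
import pandas as pd

batches = pd.DataFrame({
    "batch":         ["B01", "B02", "B03", "B04", "B05"],
    "seed_final_pH": [7.02, 6.95, 7.10, 6.88, 7.05],   # upstream factor
    "titer_g_per_L": [2.1,  1.8,  2.4,  1.6,  2.2],    # downstream response
})

slope, intercept = np.polyfit(batches["seed_final_pH"], batches["titer_g_per_L"], 1)
r = batches["seed_final_pH"].corr(batches["titer_g_per_L"])
print(f"titer ~= {slope:.1f} * pH + {intercept:.1f}  (r = {r:.2f})")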

Tuesday, July 23, 2013

Genentech: Beware Lepto Contamination

A year ago on July 19th, 2012, Genentech VP of Biologics Quality (Anders Vinther) presented "A Novel Bacterial Contamination in Cell Culture Manufacturing" at the West Coast Chapter of the Parenteral Drug Association. (This same presentation was likely made elsewhere, but the only "Google-able" mention of it was on the WCC PDA website).

Most biotech/pharma companies are tight-lipped about their biologics manufacturing process problems.
  1. For one, they run proprietary processes: it's none of our business.
  2. For two, Obamacare created a regulatory pathway for biosimilars: why share these growing pains with competitors who seek to eat their marketshare?
  3. For three, why air out dirty laundry?
So when Genentech came forward with a very detailed presentation on bioreactor contamination and a prescription for how the rest of the biotech industry should handle this specific type of contamination, it's worth paying attention.

Their summary of events goes like this:
  • Visual examination of cell culture indicates contamination of seed cultures
  • Gram stain shows no bacteria
  • 5-day incubation with standard plate count shows no growth
  • No signs of contamination by looking at dO2, pH trends
  • No evidence of contamination from standard QC testing methods
This contamination was a black swan event. Never in the history of cell culture manufacturing had anyone encountered a microbe that wasn't detectable with standard methods.

Leptospira

After a second seed bioreactor contamination, Genentech was able to cultivate the bug and identify it as Leptospira. Leptospira is a coiled/spiral bacterium that survives in soil and water. It is a motile, slow-growing obligate aerobe that favors liquid environments.

But the characteristics relevant to biologics manufacturing are:
  • 0.1 micron in diameter - CAN PASS THROUGH 0.1 micron filters!
  • Non-spore former - Not heat resistant
  • Requires long-chain fatty acids - Will not grow in media alone; requires presence of CHO cells
The remainder of the slides go through their root cause analysis and contamination investigation as well as global risk assessment (i.e. "CYA"). And it's certainly worth a gander.

For us, we would be wise to learn their lessons, which are:

  • There's a bug out there that passes through 0.1 micron filters: L. licerasiae
  • This bug and its relatives are not detectable with standard methods, so LOOK at your cultures!
  • Update control strategies (consider heat-treatment and other barriers)
Battling contaminations is bad enough. Now there's a bug out there that can get by sterile media filters and cannot be detected by any method other than putting your eyeballs on it.

Monday, July 22, 2013

Why Is There Antibiotic in Cell Culture Media?

Answer: Antibiotic is there to kill microbes, if any.

That's the short-answer.

Question: Why would there be microbes in the cell culture?

Answer: Poor aseptic practices.

So if you have good aseptic procedure, then you ought not have antibiotics in the media, right?

Answer: Right, but we still keep it in just in case.

This type of thinking pervades large-scale cell culture. And ironically, it's backwards.

I'm sure there are smart ways of using antibiotics in cell culture, but I'm not aware of any.

Antibiotic Resistance

Molecular structure of Gentamicin
The first problem with antibiotics is that they select for resistance: over time, the organisms that are susceptible to antibiotics die off, leaving the antibiotic-resistant organisms behind. These survivors may become slow-growing, low-level contaminations that are difficult to detect.

Detection

The second problem with antibiotics is that they interfere with detection. QC Microbiology tells us their tests are sensitive to 1 colony-forming unit (CFU) in the sample, which means if there is 1 CFU in a 40mL sample bottle, they'll find it. But 1 CFU/40mL is 25 CFU/L... and for a 12,000L bioreactor, that means the contamination must reach ~300,000 CFU before QC can detect it.
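
The arithmetic, spelled out as a trivial Python sketch:

# Detection-limit arithmetic from the paragraph above
sample_volume_L = 0.040        # 40 mL QC sample bottle
detection_limit_cfu = 1        # QC can find 1 CFU in that sample
bioreactor_volume_L = 12000

cfu_per_L = detection_limit_cfu / sample_volume_L   # 25 CFU/L
total_cfu = cfu_per_L * bioreactor_volume_L         # ~300,000 CFU
print(f"{cfu_per_L:.0f} CFU/L -> ~{total_cfu:,.0f} CFU before QC can see it")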

The sooner the culture hits 300,000 CFUs (assuming a 12kL bioreactor), the sooner you know there's a problem; antibiotics slow things down.

Poor Aseptic Technique

The third problem with antibiotics is that they let you get away with sloppy procedure. Back in 1973, M.F. Barile's study of mycoplasma contamination in cell culture found that 72% of cultures grown continuously in antibiotics were contaminated versus 7% of cultures grown without antibiotics. The conclusion was that over-reliance on antibiotics leads to poor aseptic technique.

Happily, we are coming across more and more customers who don't use antibiotics at the production scale, and generally speaking, these contamination investigations are mercifully straightforward.

Those other guys have long weeks of meetings ahead of them.

See also:




Friday, July 19, 2013

What IS Peptone Anyway?

According to The Free Dictionary:

pep·tone
n.
Any of various water-soluble protein derivatives obtained by partial hydrolysis of a protein by an acid or enzyme during digestion and used in culture media in bacteriology.

Question: what is the source of protein?

Answer: Do you ever wonder what happens to the parts of the animal that humans DON'T eat or use?

Peptone vendors take the animal scraps and make peptone by "dissolving" them with acid or digesting them with enzymes, eventually making them into a powder that gets sold to cell culture manufacturers.

I know of bovine- and porcine-derived peptones... (aka "beef" or "pork"). With the mad cow scare from several years back, Process Development departments were moving away from bovine towards porcine. And since then, processes that use peptones have tried to move towards non-animal-derived (aka "veggie") peptone.

As I've said before, peptone is that je ne sais quoi that the cells like and that boosts their productivity. A process development department that continues to use peptones does so at the risk of increasing manufacturing variability in favor of higher small-scale cell culture productivity.

And doing so risks making the process susceptible to peptone lot variability, which ultimately diminishes process robustness.

tl;dr - peptone is dissolved cow/pig/veggie parts ground into a fine powder, used by some biologics manufacturers to increase cell culture productivity.

Thursday, July 18, 2013

The Peptone Is NOT Your Root Cause

Cell culture volumetric productivities are low and it's not looking like this campaign is going to make enough product. The Manufacturing Sciences team is on task, and they suspect that the new peptone lot is the root cause of the low titer... but it's hard to say.

And it's hard to say because one lot of peptone goes into many batches in varying amounts, and given the multivariate nature of biological processes, there may be too many interactions to reduce this to a single variable like peptone lot... but I digress.

Looking at peptone lots vs. cell culture performance is a waste of time unless you are willing to do more than just accept the peptone manufacturer's CofA. Here's why:

There are two outcomes of this analysis:

Outcome #1: Peptone lots do not correlate with cell culture performance. Which means the root cause of low volumetric productivity lies elsewhere.

Outcome #2: Peptone lots DO correlate with cell culture performance, but given that the process is licensed with peptones and the status-quo peptone vendors are already qualified, there's no obvious action to take (other than maintain the status quo).

In programming, this logic would look something like:

if ( peptoneLotsImpactCellCulture ) {
  // Outcome #2: lots correlate, but the process is licensed with
  // peptones and the vendors are qualified... so:
  doNothing();
} else {
  // Outcome #1: lots don't correlate; the root cause lies elsewhere... so:
  doNothing();
}


See how the if-then statement is unnecessary? The same goes for analyzing whether or not peptone lots correlate with cell culture performance: the analysis is unnecessary.

Spending resources to solve a problem whose answer does not change what you are going to do is waste.

That said, what if peptone lots do happen to diminish cell culture performance? That may be the case, but just because peptone lots diminish cell culture performance does not mean that peptone lots are the root cause. The true root cause in this case is the cell culture process's susceptibility to peptone lots, or even the use of peptones in the first place. See blog post, "If Kryptonite is the Root Cause, What's the CAPA?"

Waste reduction and variability reduction are core objectives in manufacturing management. And waste reduction includes cutting out the wild-goose chases for causes that are not in your control.

Wednesday, July 10, 2013

OSI PI Historian Software Is Not Only for Compliance

In October 1999, I joined a soon-to-be-licensed biotech facility as an Associate (Fermentation) Engineer. They had just gotten done solving some tough large-scale cell culture start-up problems and were on their way to FDA licensure (which happened in April 2000).

As the Manufacturing Sciences crew hadn't yet bulked up to support full-time commercial operations, there were 4 individuals from Process Sciences supporting the inoculum and production stages.

My job was to take over for these 4 individuals so they could resume their Process Sciences duties. And it's safe to say that taking over for 4 individuals would not have been possible were it not for the PI Historian.

The control system had an embedded PI system with diminished functionality: its primary goal in life was to serve trend data to HMIs. And because this was a GMP facility and this embedded PI was an element of the validated system, the more access restrictions you could place on the embedded PI, the better it was for GMP and compliance.

Restricting access to process trends is good for GMP, but very bad for immediate-term process troubleshooting and long-term process understanding. Thus Automation had created corporate PI: a full-functioned PI server on the corporate network that would handle data requests from the general cGMP citizen without impacting the control system.

Back in the early 2000's, this corporate PI system was not validated... and it didn't need to be, as it was not used to make GMP forward-processing decisions.

If you think about it: PI is a historian. In addition to capturing real-time data, it primarily serves up historical data from the PI Archives. Making process decisions involves real-time data, which was available from the validated embedded PI system viewed from the HMI.

Nonetheless, the powers that be moved towards validating the corporate PI system, which appears to be the standard as of the late 2000's.

Today, the success of PI system installations in the biotech/pharma sector is measured by how flawlessly the IQ and OQ documents were executed. Little consideration is given to the usability of the system in terms of solving process issues or Manufacturing Sciences efficiency until bioreactor sterility issues come knocking and executive heads start rolling over microbial contamination.

Most PI installations I run into try to solve a compliance problem, not a manufacturing problem, and I think this is largely the case because automation engineers have been sucked into the CYA-focus rather than the value-focus of this process information:
  • Trends are created with "whatever" pen colors.
  • Tags are named the same as the instrumenttag that came from the control system.
  • Tag descriptors don't follow a nomenclature (see the sketch after this list).
  • Data compression settings do not reflect reality.
  • PI Batch/EventFrames is not deployed.
  • PI ModuleDB/AF is minimally configured.
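
To make the nomenclature point concrete, here's a hypothetical sketch in Python; the AREA-UNIT-MEASUREMENT pattern is invented for illustration, and actual conventions vary by site:

# Sketch: check tags against a naming nomenclature instead of inheriting
# raw instrumenttags from the control system. The pattern below
# (AREA-UNIT-MEASUREMENT) is hypothetical.
import re

NOMENCLATURE = re.compile(r"^[A-Z0-9]{2,4}-[A-Z]\d{3}-(PH|DO2|TEMP|AGIT)$")

tags = ["CCP1-T100-PH", "CCP1-T100-DO2", "FIC_4711.PV"]  # last: raw instrumenttag
for tag in tags:
    status = "ok" if NOMENCLATURE.match(tag) else "rename"
    print(f"{tag:>15}: {status}")
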
The key to efficiencies that allow 1 Associate Engineer to take over the process monitoring and troubleshooting duties of 4 seasoned PD scientists/engineers lies precisely in having a lot of freedom to use and improve the PI Historian.

If said freedom is not palatable to the QA folks (despite the fact that hundreds of lots were compliantly released when manufacturing plants allowed the use of unvalidated PI data for non-GMP decisions), the answer is to bring process troubleshooters and data scientists in at the system specification phase of your automation implementation.

If your process troubleshooters don't know what to ask for upfront, there are seasoned consultants with years of experience that you can bring onto your team to help.

Let's be clear: I'm not downplaying the value of a validated PI system; I'm saying to get user input on system design upfront.

Wednesday, July 3, 2013

Continuous Improvement of Bioreactor Sterility

In a lot of bioreactor contamination investigations, the root cause is never found; that is, the cause of bioreactor contamination is not conclusively determined.

This is quite disappointing in a cGMP environment, because the way problems get fixed is that you find the root cause, the technical folk propose a solution, you write it up in a CAPA, push it through the "change implementation team," and you never have to deal with that problem again.

But if root cause is never found, there is no corrective action; there certainly is no preventative action and the chain reaction of cGMP improvement never takes place.

One reason the proverbial smoking gun is never found is that there are too many other "smoldering guns" at the crime scene. As they say on Discovery Channel's Mythbusters of a theory that cannot be confirmed but also cannot be denied: "It's plausible."

One reason these plausible sterility risks exist is that you didn't know about them. (If so, call me.)

Another reason these plausible sterility risks exist is that they weren't worth fixing: the system wasn't "broke." You had bigger fish to fry, and it's hard to justify spending precious budget dollars on a system that was "working" fine.

That reasoning works until you get back-to-back contaminations and your biotech manufacturing plant is perceived to be out-of-control.

Now, you're staring down a laundry list of potential causes, all of which are plausible, many of them solvable, none of which you can cross off your list as the true root cause.

Which gets me to the point of this missive: you're not going to be mired in contamination for your entire career. You're going to have periods of success. All the sterility risks you can address during that time will ensure that your periods of failure in the future are short-lived.