Stephen Puryear

Stephen Puryear's Research Papers And Resources

© All Rights Reserved to Stephen Puryear for all information on this website

I have broken the research that I have done myself, or read by others and found interesting, into two groups. The first group, collected and linked on this page, covers a very wide variety of topics that interested me enough to read or write about. They are here because I hope that others in biotech or Facilities Maintenance have found their thoughts running along similar lines.

The other research is more focused because it revolves around an easily stated technical problem: what measurements would we make on the variables involved in a refrigeration system if we wanted to predict that the system was becoming less healthy or trending toward failure? All of those materials are under the Platform tab on this site.

"Behold ! Estimating Measurement Uncertainty"

I have been revisiting the ideas that make up Measurement Uncertainty (MU) and deepening my understanding. I do not consider myself a metrologist, but I have done a great many calibrations in my career. From the start, it always seemed to me that MU's link to Heisenberg's Uncertainty Principle is much stronger than the fact that the two share a word. To me, the "uncertainty" in estimates of measurement uncertainty is directly linked to Heisenberg's Uncertainty Principle. This is the concept that drove Einstein nuts.

Recently I was re-reading QED: The Strange Theory of Light and Matter by Richard Feynman. He speaks to this topic and says: "I would like to put the 'uncertainty principle' in its historical perspective: When the revolutionary ideas of quantum physics were first coming out, people tried to understand them in terms of old-fashioned ideas (such as, light goes in straight lines)... If you get rid of all the old-fashioned ideas and instead use the ideas that I am explaining in these lectures (that quantum phenomena should be considered probabilistically), there is no need for an uncertainty principle!"

The same can be said of any measurement and measurement system. If we kept track of the uncertainties, there would be little work needed to summarize an estimate of their total at the time that any particular measurement was made.

I have put together some training materials for anyone interested in learning about this topic. I have always found this area to be a fascinating combination of the concrete world of measurement and the invisible world of statistics.

The first slides in the training packet explain and illustrate some very basic terms such as average, standard deviation, root sum square, distributions, and degrees of freedom. I show how they work and how they are used to assemble an Estimate of Measurement Uncertainty. The next group of slides describes the necessary elements that make up all Estimates. In the third group of slides, I take you through a simple example of how to do an actual estimate of measurement uncertainty with as much fat trimmed away as possible. In the final group of slides, I touch on a few more advanced topics. These include the "GUM", the problem of correlation, and how to use the Welch-Satterthwaite formula to combine degrees of freedom. If you enjoy it half as much as I enjoyed making it, I will consider that a huge success.
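If you would rather see the arithmetic than slides, here is a minimal sketch in Python of how the pieces fit together. The uncertainty budget, its values, and its degrees of freedom are all hypothetical; the point is only to show root sum square combination, the Welch-Satterthwaite formula, and the coverage factor working together.

# Minimal sketch of combining an uncertainty budget.
# Budget entries are hypothetical: (name, standard uncertainty, degrees of freedom).
import math
from scipy import stats

components = [
    ("reference standard", 0.010, 50),   # e.g. from its calibration certificate
    ("repeatability",      0.015, 9),    # e.g. std dev of the mean of 10 readings
    ("resolution",         0.003, 1e9),  # rectangular distribution, effectively infinite dof
]

# Combined standard uncertainty by root sum square (RSS)
u_c = math.sqrt(sum(u**2 for _, u, _ in components))

# Welch-Satterthwaite effective degrees of freedom
nu_eff = u_c**4 / sum(u**4 / nu for _, u, nu in components)

# Coverage factor for roughly 95 % confidence from the t-distribution
k = stats.t.ppf(0.975, nu_eff)

print(f"u_c = {u_c:.4f}, effective dof = {nu_eff:.1f}, k = {k:.2f}")
print(f"expanded uncertainty U = {k * u_c:.4f}")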

"Global Trends"

The oldest artifact on this site is an article written as a critique of the pharmaceutical industry and its uneasy relationship with its regulator. It is a snapshot of things as they stood in the mid- to late 1990s. Global Trend, Needs, Issues was written by Robert Kieffer and published in July 1998. His take on process optimization, risk analysis, and tools such as FMEA is just as timely today as the day he published. For example, it was not until about eight years later that the concept of parametric release mentioned there became one of the bases for the FDA's PAT guidelines.

Using Monte Carlo Simulation to validate calibration decisions

Even after I sat for the engineering exam and got a license, I still spent a lot of time supporting the Chiron/Novartis calibration function. Later, I had the task of explaining all things calibration to QA counterparts after any standard, sensor, or instrument was found to be out of tolerance during a calibration. One good question they did ask was: why do we try to have standards that are four times better than the equipment to which we apply them? I wrote Creating and Validating a Calibration Quality Metric after using Monte Carlo simulation to answer that for myself. These virtual calibrations were performed using standards whose superiority over the device being tested ranged from 10:1 all the way down to 1:1. Then I added a subsequent out-of-tolerance factor to each standard. In other words, what would the impact be if, after the calibration was performed, the standard came back from its own calibration with a notice that it had been found out of tolerance? How much superiority on the part of the standard was enough to protect the field calibration from having to be redone? The answer I found should interest calibration service delivery people as well as those who never thought about validating the Excel normal distribution number generator.
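For anyone curious about how such virtual calibrations can be set up, here is a minimal sketch in Python. It is my own illustration under assumed error distributions, not the code behind the paper: the standard's accuracy ratio over the device under test (DUT) is swept from 10:1 down to 1:1 and the false-accept rate is tallied.

# Minimal Monte Carlo sketch: how often does a "pass" verdict mislead us
# as the standard's accuracy ratio over the DUT shrinks from 10:1 to 1:1?
import numpy as np

rng = np.random.default_rng(42)   # assumed seed, for repeatability
tolerance = 1.0                   # DUT tolerance, arbitrary units
n_trials = 100_000

for ratio in (10, 4, 2, 1):       # standard is `ratio` times better than the DUT
    true_dut_error = rng.normal(0, tolerance / 3, n_trials)
    std_error = rng.normal(0, tolerance / (3 * ratio), n_trials)

    # What the technician sees is the DUT error plus the standard's own error
    observed = true_dut_error + std_error
    passed = np.abs(observed) <= tolerance

    # False accepts: DUT called "in tolerance" when it really was not
    false_accept = np.mean(passed & (np.abs(true_dut_error) > tolerance))
    print(f"{ratio:>2}:1  false-accept rate = {false_accept:.4%}")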

Another cool use of Monte Carlo

Dan Collins (formerly of Siemens Energy and Automation and consulting since March 2009) has written a very fine article about adding a second sensor to critical processes. One of the reasons that I have linked to this article is that he also uses Monte Carlo simulation to back up his position. In this case, Dan maintains that the sensor pair will, in a sense, calibrate each other continuously. He also says that if the pair is monitored properly using a control chart approach, this method will be twice as sensitive to problems in the process or the sensors compared with using a single sensor. In addition, that same statistical approach will reveal which of the two sensors is drifting or has failed, should that occur. His conclusion (with which I agree completely) is that until a sensor failure occurs, any further routine calibration activities should stop, because they will increase, not decrease, the risk of a disruption to the process. Here is my review of this article.
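Here is a minimal sketch, in Python, of the flavor of Dan's dual-sensor argument. It is my own illustration, not his code, and the sensor noise levels and drift are invented: the difference between the two sensors goes on a control chart, and a drifting sensor announces itself by pushing that difference outside the limits.

# Minimal sketch: control chart on the difference between two redundant sensors.
import numpy as np

rng = np.random.default_rng(0)
n = 200
process = 20.0 + rng.normal(0, 0.05, n)        # the true process value
sensor_a = process + rng.normal(0, 0.02, n)    # healthy sensor
sensor_b = process + rng.normal(0, 0.02, n)
sensor_b[120:] += np.linspace(0, 0.5, n - 120) # sensor B starts drifting at sample 120

diff = sensor_a - sensor_b
baseline = diff[:100]                          # set limits while both sensors are healthy
center, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

out = np.where((diff > ucl) | (diff < lcl))[0]
print(f"control limits: {lcl:.3f} to {ucl:.3f}")
print("first out-of-control sample:", out[0] if out.size else "none")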

Awesome FREE statistical software!

Since I started studying for the ASQ Quality Engineer certification, I have seen and used a lot of statistical tools in applications all the way from Excel to Mathematica. The best one I have seen so far is also FREE. The application is called PAST. It was developed for paleontologists but has many broad graphic and analytic capabilities that I think will help many other investigators. It has helped me a lot. The technical writing that supports each tool is unusually clear and elegant. Having tried writing on these topics myself, I know how difficult it is to do it this well. The site is well maintained even as it passes its tenth birthday. It is very simple to upload Excel data into its spreadsheet format. It is great for exploring and learning about your data. I cannot recommend it enough.

Using Principal Component Analysis for data and variable reduction

One of the drawbacks to collecting cheap data is that you can easily collect everything in sight. The critical problem remains: how much of what you have collected is responsible for the phenomenon that you are studying? How many of the variables within sight are worthy of further work? How many of them are even independent of each other, for that matter? Principal Component Analysis can be of great assistance (PAST features it and makes it easy to try). I have written a description of PCA and how it has helped me in my work.
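PAST makes this point-and-click, but a minimal sketch in Python (using scikit-learn, with invented data) shows the basic move: standardize the columns, fit PCA, and watch how few components are needed to explain most of the variance when the collected variables are not really independent.

# Minimal PCA sketch: six logged variables, only two of which are truly independent.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
t1 = rng.normal(size=200)
t2 = rng.normal(size=200)
X = np.column_stack([
    t1, t2,
    t1 + 0.1 * rng.normal(size=200),                     # near-duplicate of variable 1
    0.5 * t1 - 0.5 * t2 + 0.1 * rng.normal(size=200),    # mixture of 1 and 2
    t2 + 0.1 * rng.normal(size=200),                     # near-duplicate of variable 2
    rng.normal(scale=0.1, size=200),                     # mostly noise
])

pca = PCA().fit(StandardScaler().fit_transform(X))
for i, frac in enumerate(np.cumsum(pca.explained_variance_ratio_), start=1):
    print(f"first {i} component(s) explain {frac:.1%} of the variance")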

Why and how to use Mahalanobis Distance

I have never read a full biography of the Indian statistician P.C. Mahalanobis, but he must have had a very interesting life. What impressed me the most in what I have read about this man was his journey from India to England in the 1930s to visit Ronald Fisher's experimental agriculture research station. If it is true that Fisher was in the midst of becoming the Father of Modern Statistics, as some say, Rothamsted Experimental Station was where he did it. A decade later, during WWII, George Box visited him and went on to marry one of his daughters, but I am getting ahead of myself.

P.C. Mahalanobis lent his name to a statistical measure called Mahalanobis Distance. The word "Distance" means statistical distance (in this case, measured in units of standard deviation). MD can be used to measure the distance between a known set and some new proposed members of the set. Do they belong together or don't they? I became aware of it while looking around for ways to tackle a tough problem that I describe on this site's Platform tab; I first encountered MD in Gerald van Belle's really excellent Statistical Rules of Thumb. Here is my explanation of MD. Neither of these presentations says exactly how to do it, step by step. Dr. Graeme Senior at the University of South Australia has done just that AND he explains how to do it using Excel. More than that, he was kind enough to take me over the hurdles when I wrote asking for help. Here is his paper.

The website for Thermo Scientific recently has had several papers specially focused on explaining statistical tools as they apply to spectrographic analysis. They are well written and useful for providing more education on how the tools work. They will hopefully reappear on the Thermo site after some renovations occur. In particular, there is an excellent presentation on Mahalanobis Distance as it applies to spectroscopy. In addition, they show how to use Principal Component Analysis for outlier detection. The Thermo site also has a paper that explains how and why to apply Principal Component Regression. I have found that I have to re-read these kinds of sources over several days or weeks in order for them to sink in more fully. Check these out; they can help you in your search for tools!
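For readers who want to see the mechanics before digging into those papers, here is a minimal sketch in Python using an invented reference set. The question it answers is exactly the one above: how many standard deviations away, in the correlated multivariate sense, is a proposed new member of the set?

# Minimal Mahalanobis Distance sketch against an invented "known good" reference set.
import numpy as np
from scipy.spatial.distance import mahalanobis

rng = np.random.default_rng(7)

# Hypothetical reference set: 500 observations of 3 correlated variables
cov_true = np.array([[1.0, 0.8, 0.3],
                     [0.8, 1.0, 0.4],
                     [0.3, 0.4, 1.0]])
reference = rng.multivariate_normal(mean=[10, 5, 2], cov=cov_true, size=500)

mean_vec = reference.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(reference, rowvar=False))

candidates = np.array([[10.2, 5.1, 2.1],   # looks like it belongs
                       [12.0, 3.0, 2.0]])  # each value plausible alone, but the
                                           # combination violates the correlation
for point in candidates:
    d = mahalanobis(point, mean_vec, inv_cov)
    print(f"{point} -> Mahalanobis distance = {d:.2f}")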

Reliability Centered Maintenance (RCM)

While I was with Chiron I got certified as a Quality Engineer through the American Society for Quality (ASQ). The study materials themselves pointed toward RCM, as have many sources that I have found since. One critical point in the historical development of RCM and its relatives, such as Condition Based Monitoring, was the response of the airline industry in the 1970s to the roll-out of the Boeing 747. Essentially the FAA said: well, you are going to triple your passenger load per flight, so that implies that you should triple your maintenance hours per flight. From our current perspective, we can see that there was no data to support this assertion, but at the time, that meant that there was no defense or counter-argument. This led to the famous Nowlan and Heap study of the way that parts really do wear out on a jet plane. This groundbreaking study advanced the maintenance approach past the concepts of the bathtub curve and paved the way for a modern approach that allows jets to keep flying with pretty spectacular reliability.

I remain very interested in applying some of those concepts within the pharmaceutical industry. The first thing that I published in this direction was a 2006 article in a newsletter put out by the Society for Maintenance and Reliability Professionals (SMRP). It summed up what steps I thought would be necessary to predict refrigeration system failure. Here is a PowerPoint version of how I thought we might apply this within the pharma industry. Mike Sondalini is another generous Aussie with whom I have crossed paths. Here is an article to which I am proud to link, in which Mike applies Weibull Analysis to a mining operation's desire to reduce truck tire wear (a small sketch of this kind of Weibull fit appears below). You would too if you were paying AU$150,000 per tire! One important value that RCM should bring to the table is energy cost savings. With this in mind, I applied for a Novartis Energy Award. My concept was that not only are unhealthy freezers headed for failure and possible product loss, but before they arrive at this end-point, they could cost thousands of dollars per unit just because they are becoming progressively more inefficient.
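As a companion to Mike's article, here is a minimal sketch in Python of a basic Weibull fit (my own invented failure times, not his tire data). The shape parameter is the prize: below 1 suggests infant mortality, near 1 suggests random failures, and above 1 suggests wear-out, which is what justifies age-based maintenance tasks.

# Minimal Weibull fit sketch on invented time-to-failure data.
from scipy import stats

# Hypothetical hours-to-failure for 40 components showing wear-out behaviour
hours = stats.weibull_min.rvs(c=2.5, scale=8000, size=40, random_state=3)

shape, loc, scale = stats.weibull_min.fit(hours, floc=0)  # fix location at zero
print(f"Weibull shape (beta) = {shape:.2f}, characteristic life (eta) = {scale:.0f} h")
if shape > 1:
    print("beta > 1: failures look like wear-out, so age-based tasks can help")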

Process Analytical Technology (PAT)

I discovered PAT in the early days of the freezer research covered in the Platform tab of this site. At first I was very excited and energized. I was later disappointed to find that there was not enough forward thinking at my company to have grabbed this initiative and run with it. One of my QA colleagues said: "Thank God that it's just a Guidance!" Later on it became apparent that this reaction was industry-wide and even included parts of the agency that authored it in the first place. I made this discovery after calling PAT leaders within the agency and discussing it with them over the phone. I wrote this paper on the topic. I continue to believe that a PAT-like approach is the only path to reversing the exporting of jobs in the pharma industry. I don't think very many people agree with me, however.

The Joy of Wavelets

One of the most challenging but fascinating discoveries that I have made in my career's worth of research and learning has been wavelets. Wavelets are mathematical objects (in the formal sense) that have an extremely broad range of uses and a fascinating history that extends all the way back to Joseph Fourier. As required by any good tale, wavelets were discovered by Alfred Haar in 1909 and then immediately lost for another 65 years. In common with the PCA tool that I have mentioned above, they have great value in data compression and analysis. They were very helpful in the study of my freezer data because it is composed of time series. In a way, a time series has two dimensions: one dimension is the data value of each element, and the other is that value's serial position within the dataset. Early statistical approaches tended toward summarizing a dataset with a single summary value. This sacrifices the position dimension, which is sometimes just fine and sometimes not good at all. I wrote Wavelets For Innovators with an eye toward a technical audience such as might be found in the International Society for Pharmaceutical Engineering (ISPE).
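To make the position idea concrete, here is a minimal sketch in Python of a single level of the Haar transform (a toy example with an invented temperature trace, not my freezer data). The pairwise averages are a coarse copy of the signal, and the pairwise differences keep the detail tied to where in time it happened.

# Minimal Haar transform sketch: averages keep the coarse shape, details keep the position.
import numpy as np

def haar_step(x):
    """One level of the (unnormalized) Haar transform: pairwise averages and details."""
    x = np.asarray(x, dtype=float)
    averages = (x[0::2] + x[1::2]) / 2.0
    details = (x[0::2] - x[1::2]) / 2.0
    return averages, details

# Hypothetical freezer-like temperature trace: slow drift plus one short spike
t = np.arange(64)
signal = -20 + 0.01 * t
signal[41] += 3.0                      # a one-sample excursion at sample 41

averages, details = haar_step(signal)
print("largest detail coefficient is at pair index",
      np.argmax(np.abs(details)), "(samples 40-41)")
# The big detail coefficient points straight at the excursion's location,
# which a single summary statistic such as the mean would have washed out.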


To start exploring wavelets, I recommend this article as a great place to begin learning about the extremely broad applications possible with this mathematical tool. It has been cited about a million times, so check it out and you will probably see why. Another helpful and excellent resource is Amara's website, which also has great links. Amara Graps is an astronomer, and astronomy is an example of a discipline that has grasped the wavelet analytical approach and really run with it.


Wavelet first-timers might also do well to check out Ian Kaplan, who has created some excellent wavelet materials at his Bear Cave website.


