
IBM had a message that still resonates today: “Machines should work, people should think.”


You can see a short excerpt of this message in “Paperwork Explosion,” a short film directed by Jim Henson of Muppets fame, produced in 1967.


Wayne Levin is the President and CEO at Predictum. He has been leading the charge of the Predictum team for more than 25 years.

Fast forward to today. While paperwork has lessened, we face the modern-day equivalent: data work. Engineers, scientists, and analysts spend a whopping 80% of their time just preparing data, leaving only 20% for analysis and insights.

That’s 80% working, 20% thinking. 

Do you pay your brightest minds to prepare data? Here are three data analysis bottlenecks you can automate!

1. Producing Reports


Have you found yourself spending weeks or months preparing standardized reports? This massive time sink is all too common. Automated reporting helps engineers, scientists, and analysts streamline report creation.

Extract data from your files, build control charts and graphs, then populate standardized Word documents and presentations, which you can email automatically.
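The core of that pipeline is simple. Here is a minimal Python sketch of the first step: computing individuals control chart limits and flagging out-of-control readings. The readings and column layout are invented for illustration; in practice this logic would live in a JSL script against your real process data.

```python
"""Hypothetical sketch of the data-checking step in an automated report:
compute individuals control chart limits and flag excursions. Readings
are invented example data."""
import statistics

def control_limits(values, sigma=3):
    """Return (center, lcl, ucl) for an individuals (X) control chart."""
    center = statistics.mean(values)
    # Moving-range estimate of short-term variation (d2 = 1.128 for n=2)
    moving_ranges = [abs(a - b) for a, b in zip(values, values[1:])]
    sd = statistics.mean(moving_ranges) / 1.128
    return center, center - sigma * sd, center + sigma * sd

def flag_out_of_control(values):
    """Return indices of readings outside the control limits."""
    center, lcl, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Example: a stable run with one obvious excursion at index 9
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1,
            10.0, 9.9, 10.2, 16.0, 10.1, 9.8]
print(flag_out_of_control(readings))  # → [9]
```

From here, the flagged indices and charts would be written into the standardized report template rather than printed.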

At Predictum, we regularly automate annual product reports (APRs), quality control reports, and other reports, using JMP software. With automation, you will hear statements from your teams such as: 

  • “What used to take 4 hours now takes 12 minutes – error-free!”
    (Consumer Products)
  • “We used to spend 3 hours on reliability reports for each piece of equipment. With automated reporting, that dropped to 45 minutes. The standardization that comes with automation eliminated the baffling.” (Vice President, Heavy Industry)
  • “Each of our reports took 20 hours, and we submit 20 reports annually. That number is now closer to 2 hours each.” (Pharmaceutical Company)

Automation removes the work of preparation, so you can generate insights. For more information, check out our recent series: “Automated Reporting in Biotechnology and Pharma.” Even if you don’t work in these fields, this webinar is worth a serious watch. The technology is applicable in all industries.

2. Data Transfer from Real-Time Data Repositories

Transferring data from an organization-wide database or data warehouse can be daunting.

Predictum has helped clients who use JMP to transfer data from OSI PI, a system that collects and stores decades of high-quality, real-time data from virtually any source.

The solution is a JMP/PI integration, developed using JMP Scripting Language (JSL). This integration features a graphical interface where users can bring data from PI directly into JMP. With a few clicks, transfer data and unlock insights.

Best of all? This JMP/PI integration is open-source. 
 
Download the integration here. Encounter roadblocks? Contact our team for help implementing this tool in your organization.
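For a sense of what such an integration does under the hood, here is a hedged Python sketch of pulling recorded values through PI Web API, the REST interface to the PI System, and flattening them into rows ready for import. The server URL, WebId, and tag name are placeholders, and the JSON shape follows PI Web API’s documented `Items`/`Timestamp`/`Value` convention; verify both against your own server before relying on this.

```python
"""Hedged sketch of fetching recorded values from a PI Web API endpoint
and flattening them into rows. Server URL, WebId, and tag name are
placeholders; the example parses a canned response instead of making a
live request."""
import json
from urllib.parse import urlencode

def recorded_values_url(base_url, web_id, start="*-7d", end="*"):
    """Build the streams/recorded URL for one tag (identified by WebId)."""
    query = urlencode({"startTime": start, "endTime": end})
    return f"{base_url}/piwebapi/streams/{web_id}/recorded?{query}"

def flatten(response_json, tag):
    """Turn a recorded-values response into (tag, timestamp, value) rows."""
    return [(tag, item["Timestamp"], item["Value"])
            for item in response_json.get("Items", [])]

# Offline example using a canned response in PI Web API's documented shape
sample = json.loads(
    '{"Items": [{"Timestamp": "2023-03-01T00:00:00Z", "Value": 71.2},'
    ' {"Timestamp": "2023-03-01T00:01:00Z", "Value": 71.5}]}')
rows = flatten(sample, "Reactor1.Temp")
print(rows)
```

In the actual JMP/PI integration, the graphical interface hides all of this: users pick tags and time ranges, and the JSL layer handles the request and lands the rows in a JMP data table.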

3. Data Extraction and Standardization from Quality Control Reports


Do your quality control teams generate scattered data and reports? One of our clients faced this challenge.

Their teams generated text files from quality control processes. The files contained reports which helped analysts evaluate process stability and quality. But manually transposing this data from many reports to a standard, data table format was not an effective use of time. 

Predictum’s custom automation helped extract data from the text file reports, eliminating manual work and human error. 
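The pattern behind this kind of extraction is straightforward. Below is an illustrative Python sketch that parses key fields out of a free-text QC report into standardized rows; the report layout (Batch/Parameter/Result lines) is invented for the example, and a real parser would mirror the client’s actual file format.

```python
"""Illustrative sketch of extracting fields from a free-text QC report
into standardized (batch, parameter, value) rows. The report layout is
invented for the example."""
import re

REPORT = """\
QC REPORT
Batch: B-1042
Parameter: Viscosity
Result: 342.5 cP
Parameter: pH
Result: 6.8
"""

def parse_report(text):
    """Return rows of (batch, parameter, value) from one report."""
    batch = re.search(r"^Batch:\s*(\S+)", text, re.M).group(1)
    params = re.findall(r"^Parameter:\s*(.+)$", text, re.M)
    results = re.findall(r"^Result:\s*([\d.]+)", text, re.M)
    return [(batch, name.strip(), float(value))
            for name, value in zip(params, results)]

print(parse_report(REPORT))
# → [('B-1042', 'Viscosity', 342.5), ('B-1042', 'pH', 6.8)]
```

Run over a folder of such files, this turns hours of manual transposing into a single scripted step, with every value traceable back to its source report.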

Check out our Customer Success story: Electronic Devices.

I’m considering automation. How do I justify this proposal?  


Most custom automations built in JMP by teams like Predictum’s DevOps team are affordable, easy to scale, easy to validate, and can be enhanced and maintained for years.
 
 
We recommend you start your justification by learning what frustrates your staff most in their work, as these frustrations are markers for powerful opportunities. Endeavor to learn: 

  • What are the most time-consuming tasks in their workloads?
  • Are these tasks repetitive, or do they require manual intervention?
  • Where do errors or inconsistencies in the data analysis process seem to crop up?
  • Are there any opportunities to standardize or optimize workflows?
  • Could automating these tasks free up time for more complex analysis and decision-making? 

Once you have a case for time and cost savings, determine whether you have the expertise to build the automations you need in-house with a JSL-proficient staff member, or whether a team like Predictum’s JSL DevOps team should take on the time, frustration, and maintenance your automation objective requires.
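A back-of-the-envelope calculation is often enough to start the justification. The sketch below uses the figures from the pharmaceutical testimonial above (20 reports per year dropping from 20 hours to 2 hours each); the hourly rate and one-time build cost are placeholder assumptions you would replace with your own numbers.

```python
"""Back-of-the-envelope ROI sketch using figures from the testimonials
above. HOURLY_RATE and DEV_COST are placeholder assumptions."""
HOURS_BEFORE = 20        # hours per report before automation (from testimonial)
HOURS_AFTER = 2          # hours per report after automation (from testimonial)
REPORTS_PER_YEAR = 20    # reports submitted annually (from testimonial)
HOURLY_RATE = 85         # assumed loaded cost per analyst hour (USD)
DEV_COST = 15_000        # assumed one-time cost to build the automation (USD)

hours_saved = (HOURS_BEFORE - HOURS_AFTER) * REPORTS_PER_YEAR
annual_savings = hours_saved * HOURLY_RATE
payback_months = 12 * DEV_COST / annual_savings

print(hours_saved)                 # → 360 hours per year
print(annual_savings)              # → 30600 dollars per year
print(round(payback_months, 1))    # → 5.9 months
```

Even with conservative assumptions, the payback period is usually measured in months, and the freed-up hours go back to analysis rather than preparation.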

Want help justifying your requirements? Schedule a call with our developer team! We can help you determine the time, resources, budget and return on investment you can expect from automation. 

Until then, we hope the resources above help you achieve stronger products, processes, and efficiency.

Optimize onward!

For over 25 years, Predictum has enabled companies to achieve higher levels of productivity, operational improvement and innovation, and realize significant savings in cost, materials, and time. Our team of engineers, data scientists, statisticians, and programmers leverages deep expertise across various industries to provide our clients with unique solutions and services that transform data into insightful discoveries in engineering, science, and research. To get in touch with our team, visit www.predictum.com/contact.

