CONSTRUCTION & REAL ESTATE
Discover how crafting a robust AI data strategy can identify high-value opportunities. Learn how Ryan Companies used AI to enhance efficiency and innovation.
Read the Case Study ⇢ 

 

    LEGAL SERVICES
    Discover how a global law firm uses intelligent automation to enhance client services. Learn how AI improves efficiency, document processing, and client satisfaction.
    Read the Case Study ⇢ 

     

      HEALTHCARE
A digital health startup trained a risk model and built a robust, precise, and scalable processing pipeline so providers could move faster, and patients could move with confidence after spinal surgery.
      Read the Case Study ⇢ 

       

        ⇲ Dive Into
        LEGAL SERVICES
        Learn how Synaptiq helped a law firm cut down on administrative hours during a document migration project.
        Read the Case Study ⇢ 

         

          GOVERNMENT/LEGAL SERVICES
          Learn how Synaptiq helped a government law firm build an AI product to streamline client experiences.
          Read the Case Study ⇢ 

           


Mushrooms, Goats, and Machine Learning: what do they all have in common?  You may never know unless you get started exploring the fundamentals of machine learning with Dr. Tim Oates, Synaptiq's Chief Data Scientist.  You can read his new book, visualize its examples in Python, tinker with inputs, and practice machine learning techniques for free.

            Start Chapter 1 Now ⇢ 

             

              ⇲ Artificial Intelligence Quotient

              How Should My Company Prioritize AIQ™ Capabilities?

               

                 

                 

                 

                Start With Your AIQ Score

                  5 min read

                  Traditional Business Rules vs. Predictive Software Features


I spent nearly half of my career writing requirements and developing and testing software the traditional way.  Although I was a strong advocate for iterative software development practices before Agile was "the thing," it took me a while to fully comprehend and appreciate the mindset shift required to move from developing traditional business rules to predictive features in software.

Traditional software development that incorporates business rules follows a well-known series of high-level steps:


                  → DISCOVERY: Document the problem that needs a solution

                  → REQUIREMENTS & DESIGN: Draft requirements and compile the solution design

                  → DEVELOP: Build the software that meets the requirements and design

                  → TEST: Assess whether the software actually meets the requirements and design specifications

                  → BUG-FIX: Address software bugs and requirements and design discrepancies

                  → LAUNCH: Migrate software to a production environment for operations management


Developing predictive features is different.  You don't know what is possible until you complete at least one round of experimentation by conducting a feasibility study.


                  → DISCOVERY: Document the problem that needs a solution

→ FEASIBILITY STUDY: Train a model or use an existing model to determine if it addresses the problem:

If the model output addresses the problem:

→ MODEL DEPLOYMENT: Prepare and deploy the model, or integrations to an existing model, to production for operations management

→ REQUIREMENTS & DESIGN: Draft requirements for the user experience that delivers the model output and compile the overarching solution design

→ DEVELOP: Build the user experience software that meets the requirements and design

→ TEST: Assess whether the user experience software actually meets the requirements and design specifications

→ BUG-FIX: Address user experience software bugs and requirements and design discrepancies

→ LAUNCH: Migrate user experience software to a production environment connected to the model in production for operations management

If the output is poor and there isn't enough quality data, execute processes to collect more quality data, then repeat the FEASIBILITY STUDY
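The branching workflow above can be sketched as a simple loop.  This is a minimal illustration, not a prescription: the function names (`train`, `evaluate`, `collect_more_data`) and the accuracy target are hypothetical placeholders standing in for whatever your team actually uses.

```python
# A minimal sketch of the feasibility-study loop described above.
# train, evaluate, collect_more_data, and the 0.90 accuracy target
# are hypothetical placeholders, not part of the article.

def feasibility_study(train, evaluate, collect_more_data, dataset,
                      target_accuracy=0.90, max_rounds=3):
    """Repeat train-and-evaluate rounds until the model output is good
    enough to proceed to deployment, or we run out of rounds."""
    accuracy = 0.0
    for _ in range(max_rounds):
        model = train(dataset)
        accuracy = evaluate(model, dataset)
        if accuracy >= target_accuracy:
            # Model output addresses the problem: proceed to
            # MODEL DEPLOYMENT and REQUIREMENTS & DESIGN.
            return model, accuracy
        # Output is poor: collect more quality data and repeat.
        dataset = collect_more_data(dataset)
    return None, accuracy  # not feasible yet with the data at hand
```

The key point the loop makes explicit: the decision to proceed is gated on an experiment, not on a requirements document.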


                  Many companies get tripped up on the FEASIBILITY STUDY and REQUIREMENTS & DESIGN steps for predictive features.  So, let’s dig into those in detail.

                  FEASIBILITY STUDIES

The crux of this step is to determine if you can train or use an existing model to meet the expectations of your users and solve the problem. Unfortunately, this isn't black and white.  If models produced 100% accurate results, they wouldn't be predictive models. They would be hard-coded business rules.

                  Let’s take Amazon as an example… If Amazon’s product recommendations were really poor, users would lose trust and confidence in their service.

At the end of the day, you have to decide how accurate model predictions need to be for you to feel comfortable about users interacting with the model.

Suppose that the first time you train a model or use an existing model, you aren't happy with the results.  What can you do to improve them?

                  While there are many technical techniques you can deploy that I won’t go into here, it often comes down to the underlying data used to train the model.  We’ve seen this countless times across our 200+ projects.

                  Do you really have enough representative data?

More confident models require more representative data, which incurs more human work.

                  Deeply experienced data scientists may have a “gut feel” after exploring a sample dataset; but, determining whether you really have enough representative data to meet your expectations requires experimentation.  Therefore, it’s imperative to train a model with your real data and test it before setting any significant expectations.

In many cases, we've found our clients are overly confident in their data and tend to have high expectations for model accuracy.  After we complete the first feasibility study, we oftentimes need to explain to them that there is more work to do to improve upon or expand their dataset before proceeding.
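One concrete way to ground those expectations is a simple hold-out experiment on your real data: measure accuracy only on examples the model never saw during training.  A minimal sketch, assuming a labeled dataset of (features, label) pairs; the majority-class baseline here is a hypothetical stand-in for whatever model a feasibility study would actually train.

```python
# A minimal hold-out experiment on real labeled data. The
# majority-class "model" is a hypothetical baseline, not the
# article's method; swap in your own training step.
import random
from collections import Counter

def holdout_accuracy(labeled_data, test_fraction=0.3, seed=42):
    """Split data into train/test, fit a trivial baseline on the
    training portion, and report accuracy on the unseen portion."""
    rng = random.Random(seed)
    data = labeled_data[:]
    rng.shuffle(data)
    split = int(len(data) * (1 - test_fraction))
    train, test = data[:split], data[split:]
    # "Train": always predict the most common label in the training set.
    majority = Counter(label for _, label in train).most_common(1)[0][0]
    correct = sum(1 for _, label in test if label == majority)
    return correct / len(test)
```

If even a trivial baseline scores close to your candidate model on held-out data, that is a strong hint the dataset, not the modeling technique, is the bottleneck.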

                  REQUIREMENTS & DESIGN

                  What’s the right user experience to deliver your model’s outputs?

Unfortunately, for most AI-first organizations, the user experience comes last. There's a trained model ready in production, but the team hasn't yet thought about how to surface the model in a valuable user experience.

I could ramble on and on about user-centered design; but, I'll keep it simple here.  If you are one of the many organizations that wait until the end to thoughtfully consider the user experience, be ready for rework.  Again and again, we have found that the value of a model is determined by the data that fuels it, the depth of experience of the team that trains it, and the user experience that captures the training data and delivers its output.

To address this issue, I strongly recommend taking both a top-down (user-centric) and a bottom-up (data-centric) approach to developing AI solutions.

                  The Bottom Line

                  Developing predictive features in software requires a different mindset and approach compared to traditional business rules. The key difference lies in the iterative nature of predictive features, where feasibility studies and representative data play a crucial role.

                  As innovators in your organization, you need to be prepared to experiment, collect representative data, and refine models and user experiences accordingly. By taking a user-centric and data-centric approach, your business can unlock the full potential of predictive features and create valuable solutions that meet the needs of your users.



                   

                  About Synaptiq

                  Synaptiq is an AI and data science consultancy based in Portland, Oregon. We collaborate with our clients to develop human-centered products and solutions. We uphold a strong commitment to ethics and innovation. 

                  Contact us if you have a problem to solve, a process to refine, or a question to ask.

You can learn more about our story through our past projects, blog, or podcast.

                  Additional Reading:

                  Traditional Business Rules vs. Predictive Software Features

                  I spent nearly half of my career writing requirements and developing and testing software the traditional way....

                  The TOP DOWN, BOTTOM UP Approach Is a Must for AI Products

                  If you don’t take a TOP DOWN and BOTTOM UP approach to design user-facing AI software, you’ll suffer the consequences...

                  Is Your Head in the Sand?

                  I regularly meet executives that are pondering what to do with AI.

                  Typical responses I hear are:

                  “Clearly it’s...