Interview Questions for Business Analysts and Systems Analysts



INTERVIEW QUESTION:

Describe Artificial Intelligence and how it might impact the Business Analysis profession.

Posted by Chris Adams


Categories: Business Analysis, Systems Analysis, General

ANSWER

Artificial intelligence (AI) is an overarching term used to describe how computers are programmed to exhibit human-like intelligence, such as problem solving and learning.  This definition of AI is broad and non-specific, which is part of the reason the scope of AI can sometimes be confusing.  As machines become increasingly capable of performing "intelligent" tasks, those tasks slowly become commonplace and as such are removed from the scope of what is generally accepted as artificial intelligence. This is known as the AI effect.  A more precise definition might be: any device that takes in information from its environment and acts on it to maximize the chance of achieving its goal.
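To make that definition concrete, here is a minimal Python sketch of the sense-and-act loop it describes. The thermostat "environment," goal temperature, and actions are invented for illustration; they are not part of the original answer.

    # A toy "rational agent": sense the environment, then choose the action
    # most likely to move it toward the goal. All details are hypothetical.
    class ThermostatAgent:
        def __init__(self, goal_temp: float):
            self.goal_temp = goal_temp

        def act(self, sensed_temp: float) -> str:
            if sensed_temp < self.goal_temp:
                return "heat"   # too cold: heating best advances the goal
            if sensed_temp > self.goal_temp:
                return "cool"   # too warm: cooling does
            return "idle"       # goal achieved: do nothing

    agent = ThermostatAgent(goal_temp=21.0)
    print(agent.act(18.5))  # -> "heat"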

Imagine a computer program that accepts loan applicant information, applies several complex decisioning rules, and determines whether to approve the applicant for a loan based upon the probability of default.  This is a form of AI, or at least it used to be.  But most of us probably no longer find this type of behavior complex enough to rise to the level of AI.  There is a saying that goes "AI is whatever hasn't been done yet".
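A hedged sketch of such a rules-based program, in Python (the thresholds and applicant fields below are invented for illustration, not actual underwriting rules):

    # Hand-written decisioning rules: "narrow AI" in the sense described above.
    def approve_loan(credit_score: int, debt_to_income: float) -> bool:
        if credit_score < 620:       # hypothetical minimum score
            return False
        if debt_to_income > 0.43:    # hypothetical affordability cutoff
            return False
        return True                  # low enough default risk: approve

    print(approve_loan(credit_score=700, debt_to_income=0.30))  # True

Because every rule is fixed in advance by the programmer, the program never improves with experience; that is what separates it from the machine learning approaches described next.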

The spectrum of artificial intelligence runs from narrow AI to general AI.  Determining whether to approve a loan applicant is narrow AI. It's a program built with very specific rules to solve a very specific problem.  General AI is on the other end of the spectrum.  It's what people think about when they imagine a fully independent and reasoning superhuman-like machine.

Two rapidly expanding areas of AI are machine learning and deep learning.  They are best described as techniques for achieving artificial intelligence and are driving massive and accelerating progress in the field.  You can no longer speak about AI without mentioning them.

Machine learning is an approach that goes beyond programming a computer to exhibit "smart" behavior. Machine learning programs learn from the environment and improve their performance over time. Most machine learning techniques require the programmer to examine the dataset ahead of time and identify the important features.  Features are the attributes of the data that best correlate with successfully predicting the desired output. For example, a credit score is likely an important feature of a loan applicant dataset when determining the risk of loan default.  The programmer then determines the best models for the machine learning program to apply to the features so that the error rate of predicted outputs is minimized.  It's important to understand that a machine learning program must be trained: hundreds or thousands of well-defined data records need to be fed into the program so the predictive model can refine itself over time.  With each record, it learns to more accurately predict outputs when given a new input.
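As a sketch of that workflow, the snippet below trains a simple scikit-learn model on two programmer-identified features. The synthetic records, the feature choices, and the choice of logistic regression are all illustrative assumptions:

    # Train on historical loan records, then predict default risk for a new one.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Programmer-identified features: [credit_score, debt_to_income]
    X_train = np.array([[720, 0.25], [580, 0.55], [690, 0.30], [540, 0.60]])
    y_train = np.array([0, 1, 0, 1])   # 1 = defaulted, 0 = repaid

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)        # "training" refines the predictive model

    new_applicant = np.array([[650, 0.40]])
    print(model.predict_proba(new_applicant)[0, 1])  # estimated default probability

In a real project the training set would contain thousands of records rather than four, and each additional record helps the model predict more accurately, as described above.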

Another popular AI technique, which is itself a subset of machine learning, is deep learning.  Just like other machine learning programs, deep learning programs learn and improve their performance over time.  Deep learning programs get their name from the "deep" multi-layered neural networks they use to learn and predict outcomes. Much like the structure of the human brain, neural networks are made up of many nodes (like neurons) that receive inputs, perform a function, and pass the result on to other nodes.  By chaining many nodes together in a web-like or tree-like structure, complex decisioning can be achieved.  Unlike other types of machine learning programs, deep learning neural nets do NOT require the programmer to pre-identify the important features of the data; they are capable of automatically extracting the features that are most influential in producing successful predictions.  Deep learning programs require substantial computing power and massive amounts of data to be trained.
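The layered structure (though not the training) can be sketched in a few lines of numpy. The layer sizes and inputs are invented, and the random weights are placeholders where a trained network would hold learned values:

    # Each "node" computes a weighted sum of its inputs and applies a function
    # (here ReLU), passing the result on to the next layer of nodes.
    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, n_out):
        W = rng.normal(size=(x.shape[0], n_out))   # connections into each node
        return np.maximum(0, x @ W)                # weighted sum + activation

    x = np.array([650.0, 0.40])      # raw inputs; no hand-picked features required
    h1 = layer(x, 8)                 # first hidden layer of 8 nodes
    h2 = layer(h1, 8)                # second hidden layer: the "deep" part
    score = h2 @ rng.normal(size=8)  # output node: a single risk score
    print(score)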

So what does all of this mean for the business analyst?  According to a widely cited Oxford University study, nearly 47% of US jobs are at high risk of being automated within roughly two decades. Will business analysts be among them? That is a frightening projection indeed, but let's put it into perspective.  First, people in their mid-40s have much less to worry about, since they will likely be approaching retirement by then. For those who are younger, two decades is a lot of time to adapt and to focus on continuing education and retraining as needed. Keep in mind that many jobs won't disappear with a single AI advancement.  Instead, various aspects of a job will slowly be replaced by AI over time.

Baidu chief scientist, Coursera co-founder, and Stanford adjunct professor Andrew Ng is a respected leader in the AI field.  During a speech at Stanford he addressed what he saw as some of the more immediate ways that business analysts and product managers will need to evolve as they support AI projects. Traditional applications tend to get their information through keyboard inputs, mouse clicks, and input files in text form.  But AI programs typically require vastly larger quantities of data to be successful and, therefore, get their information in alternative formats such as voice streams, video, and photographs, much of it arriving in real time.  There isn't yet consensus around how to best define and communicate requirements for these types of sources. This is perhaps the first and most immediate opportunity for business analysts: adapting our role to AI projects.  One thing is certain: the safest place to be when AI starts wiping out jobs is working in AI.

--
Chris Adams
LinkedIn Profile


ADDITIONAL ANSWERS / COMMENTS

raviraj posted on Wednesday, September 27, 2017 9:49 AM
Very good explanation...liked it!!
raviraj
Mohammad Rafi posted on Sunday, October 1, 2017 9:45 PM
Good explanation, pls don't stop sharing...
Mohammad Rafi
Richard posted on Thursday, November 9, 2017 2:40 PM
I learned about and used neural networks and AI decades ago, with the same hype and predictions. In spite of all the hype, computers are cognitive morons. Besides, most of the data fed into these entertaining algorithms is of poor quality. It's like trying to turn lead into gold using data. But the loss of jobs is real.

Also affected will be "middle management" jobs: those who are currently "data pushers," taking data from a database and putting it into spreadsheets and vice versa. This can all be automated without AI or neural networks. Even a robot can do this.

There will come a time in the not too distant future when organizations realize that data is not as valuable an "asset" as claimed and will begin to scrutinize all expenditures related to dressing up data.

The infotainment value of data will diminish when the real results are examined closely. AI will become another silver bullet, and data lakes will be replaced by galactic data.

In the meantime, revel in the glow of the data, but don't count on a long career if your job has much to do with data. Find yourself a job building apps. No matter how mundane and useless, apps sell.

AI and neural networks are so yesterday, except for those millennials who don't read history.
Richard


 


