Stack #2549802

Question / Answer
There are 2 types of statistics (Analytics) descriptive & inferential
descriptive statistics are used to inform/explain
inferential statistics are used to predict/trend
name 4 levels of measurement (NOIR) Nominal, ordinal, interval, ratio
continuous data w/unique zero point ratio
orders data at EQUAL distance apart interval
place qualitative objects in some kind of order ordinal (order)
identify, group or categorize nominal (neat)
outliers create this type of error out-of-range
unpredictable error random error - no correlation
error may occur from missing data. (example: space not filled in) omission error - distorted results
this error repeats itself systematic error - skewed results
what is the purpose of the process of quality control reduce/minimize errors
experimental study all variable measurements & manipulations are under the researcher's control
observational study used when it is impractical or impossible to control the conditions of the study
blind study participants are not told if they are in the treatment group or control group
treatments the procedure the researcher applies to each subject
double blind study neither the treatment allocator nor the participants know who is in the treatment group or control group
information bias questions favor an outcome or the interviewer asks questions that favor an outcome
expected monetary value (EMV) analysis the average outcome (payoff) when the future includes scenarios that may or may not happen
outliers observation points that are distant from other observations
measurement bias bias that occurs from not selecting a random sample
conscious bias bias introduced because respondents believe it will be beneficial if selected
median - outliers do not affect the median middle score for a data set
z-score tells us the number of standard deviations a data point is from the mean
variance if the average is the same for 2 groups, what will determine their difference?
standard deviation the spread of data in a sample.
standard deviation how far the data points are from the mean
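A minimal Python sketch (the sample values are invented) tying these measures together: mean, variance, standard deviation & the z-score of one data point.

    import statistics

    data = [4, 8, 6, 5, 3, 7, 9]            # hypothetical sample values
    mean = statistics.mean(data)             # central tendency
    variance = statistics.variance(data)     # sample variance: spread about the mean
    sd = statistics.stdev(data)              # sample standard deviation: square root of the variance
    z = (9 - mean) / sd                      # how many standard deviations the point 9 is from the mean
    print(mean, variance, sd, z)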
mean measure of central tendency that is influenced by the size of the values in a dataset
quartiles each of the four quartile groups a population can be divided into
IQR: Inter-quartile range data must be ordered from lowest to highest value
box plot used to study the composition of a data set & determine the distribution
percentile score order: max value, 75th percentile, median, 25th percentile, min value
mean & median skewness does not affect these
outliers Can be included or excluded in analysis (causes skewness)
1 customer & 6 booths = 1/6 or 16.7% There are 6 toll booths to enter the highway. What probability does each toll booth worker have of getting the next customer?
combination the order you pick your sample in does not matter
combination Example: picking employees for a shift. Order doesn't matter
Bayes Theorem you must know P(A), P(B)
Bayes Theorem P(A) given B
Bayes Theorem When given P(A) given P(B), you can use this to find the P(B) given P(A)
multiplication apply this rule when looking for 2 events occurring (AND)
addition use this rule when looking for 1 or the other event happening (OR)
linear programming a technique for minimizing total cost or maximizing profit based on constraints
linear regression a technique using a single independent variable to predict a single dependent variable
multiple regression a technique using more than 1 independent variable to predict a single dependent variable
correlation coefficient measures the strength of a linear relationship
R2 or R squared measures the goodness of fit in a regression analysis
statistical tests R2, correlation coefficient, multiple regression, linear regression, linear programming
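A short Python illustration of two of these, the correlation coefficient and R squared, on invented x/y values (numpy & scipy are assumed; the cards only name the concepts).

    import numpy as np
    from scipy import stats

    x = np.array([1, 2, 3, 4, 5], dtype=float)    # independent variable (hypothetical)
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])       # dependent variable (hypothetical)

    r = np.corrcoef(x, y)[0, 1]        # correlation coefficient: strength of the linear relationship
    fit = stats.linregress(x, y)       # simple linear regression
    r_squared = fit.rvalue ** 2        # R squared: goodness of fit of the regression
    print(round(r, 3), round(r_squared, 3))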
descriptive statistic measurements median, mean, z-score, variance, standard deviation
types of bias information, outliers, expected monetary value (EMV), measurement, conscious
types of studies experimental, observational, blind, treatments, double blind
types of errors out-of-range, random, omission, systematic
levels of measurement ratio, interval, ordinal, nominal
distribution & central tendency time series, trend, irregularity, cyclicality, seasonality
time series a simple regression using time as the independent variable
trend a general slope upward or downward over a period of time
irregularity unforeseen circumstances causing random deviations
cyclicality repetition in up and down patterns
seasonality regular pattern within a single year
negative from left to right: arrow slanting from top to bottom (downward)
positive from left to right: arrow slanting from bottom to top (upward)
strong dots are close together
weak dots are spread apart
cumulative distribution represents the probability that a variable falls within a certain range
probability distribution a list of all of the different probabilities of each outcome that can occur
z-score for 99% level of confidence 2.575
z-score for 95% level of confidence 1.96
normal distribution measures of central tendency are approximately equal (mean & median)
ANOVA used to compare the mean of three or more groups
analysis techniques ANOVA, trend, irregularity, cyclicality, seasonality
F-value must be higher than critical value to reject the null
F-value ANOVA uses this test statistic
T-value must be higher than critical value to reject the null
T-value T-test uses this test statistic
zero a correlation is weak if the coefficient is close to this
1 or -1 a correlation is strong if the coefficient is close to this
7 basic quality tools run chart, control chart, cause & effect diagram, flowchart, check sheet, histogram & pareto, scatter diagram
run chart illustrates performance measurements over a period of time
control chart illustrates limits or constraints a process should not exceed
cause & effect diagram assists in brainstorming issues that are causing a problem
flowchart visual tool to understand a process
check sheet easy tool to collect data to create other charts
histogram & pareto graphical display of a data set with 1 bar for each category
pareto graphical display of data set in highest to lowest order
histogram graphical display of data set centered
scatter diagram used for potential relationships & correlation between variables
yes can the seven basic quality tools be used independently
90%-95% what percent of quality problems does Ishikawa claim the 7 tools can solve
SIPOC supplier-input-process-output-customer
SIPOC diagram demonstrating all of the elements that can influence a process before it starts
six sigma manufacturing approach to improving processes
quality control in manufacturing, statistics are used for this purpose
Act the step in the PDSA that is a response to analytical results
PDSA Plan-Do-Study-Act formerly PDCA (Plan-Do-Check-Act)
attribute shows whether a result meets a requirement or not
variable shows how well a result meets the requirement
common cause variation variations accepted as a normal part of the process
special cause variation variation from an abnormality causing large discrepancy in results
IRT Item response theory
IRT model of designing, analyzing & scoring tests
government benefits aren't always money could be flood prevention or welfare
how does government differ from private sector cost-benefit analysis government benefits aren't always money
norm referenced compares one individual's performance to other individuals
criterion referenced compare individual's performance to a standard score (ex: cut score 64%)
cost-benefit analysis used to analyze if funding is worth the outcome of a project
very large data sets what is big data
prevalence used to count all of the existing cases of a disease
incidence (incident rate) used to count only NEW cases of a disease
used in big data very large data sets, prevalence, incidence, criterion referenced, cost-benefit analysis
RBM Results based management
RBM management strategy that uses results as the central measurement of performance
performance measures KPI, KPI dashboard, CLIF
KPI Key performance indicators
KPI performance measure for 1 specific goal
KPI dashboard multiple KPIs are displayed for the big picture
CLIF customer learning, internal process, financial performance
What does a balance scorecard measure CLIF
What does a balance scorecard measure Are we meeting the strategy?
Disadvantage of balanced score card requires time & effort to establish a meaningful scorecard
advantage of a balanced scorecard improves internal & external communication
Disadvantage of balanced score card difficult to maintain momentum
advantage of a balanced scorecard improves organizational alignment
advantage of a balanced scorecard links strategy to organizational results
advantage of KPI data driven results make it easier to quantify performance
disadvantage of KPI difficult to change once set up
garbage in garbage out bad data produces bad results/decisions
analytics the extensive use of data, statistical & quantitative analysis, explanatory & predictive models & fact-based management to drive decisions & add value
analytics encompasses statistics or the study of data analysis
analytics encompasses management science or the study of model building, optimization & decision making
analytics typically implies the analysis of very large data sets
analytics is turning information into insight & developing conclusive fact-based strategies to gain a competitive edge
you can't manage what you don't measure
big data refers to both structured & unstructured data in large volumes
structured data example: credit card transactions through a website
unstructured data word documents or the bodies of emails
unstructured data videos from a traffic accident site
unstructured data data that doesn't reside in a traditional row-colum database format
managers are constantly called upon to make decisions in order to solve problems
decision making & problem solving ongoing processes of evaluating situations or problems, considering alternatives, making choices & following them up with necessary action
the entire decision-making process is dependent upon the right information being available to the right people at the right time
quantitative decision making steps framing the problem, solving the problem, communicating results
framing the problem includes broad definition of business problem
framing the problem includes review what has happened in the past
framing the problem includes type of analysis
framing the problem includes what data & how to collect
solving the problem includes specific question to analyze
solving the problem includes type of data, data collection & data error
solving the problem includes analysis technique & analysis
communicating results includes presentation of analysis output
communicating results includes making a recommendation
analytics is turning information into insight
value and difficulty increase when going from information to optimization
value and difficulty increase when going from hindsight to insight to foresight
hindsight descriptive analysis & diagnostic analysis
descriptive analysis what happened
insight predictive analysis
diagnostic analysis why did it happen
foresight prescriptive analysis
predictive analysis what will happen
prescriptive analytics how can we make it happen
descriptive/diagnostic analytics encompasses the set of techniques that describe what has happened in the past
descriptive/diagnostic analytics the discovery & communication of meaningful patterns in data
descriptive/diagnostic analytics the vast majority of the stats we use fall into this category
descriptive/diagnostic analytics think basic arithmetic like sums, averages, percent changes
descriptive/diagnostic analytics utilizes graphical analysis such as scatter grams, histograms, pareto charts & run charts
descriptive/diagnostic analytics useful to show total stock in inventory, avg dollars spent per customer & year over year changes in sales
common examples of descriptive/diagnostic analytics reports that show historical data regarding a company's production, financials, operations, sales, finance, inventory & customers
predictive analysis consists of techniques that use models constructed from past data to predict the future or ascertain the impact of 1 variable on another
predictive analysis uses models such as regression, time series & statistical quality control
predictive analysis uses simulation
predictive analysis simulation involves the use of probability & stats to construct a computer model to study the impact of uncertainty on a decision
prescriptive analytics decision models that indicate the best course of action to take
prescriptive analytics uses optimization models, simulation optimization & decision analysis
optimization models models that suggest the best decision subject to constraints of the situation
simulation optimization combines the use of probability & stats to model uncertainty w/optimization techniques to find good decisions in highly complex & highly uncertain settings
decision analysis used to develop an optimal strategy when a decision maker is faced with several decision alternatives & an uncertain set of future events
decision analysis employs utility theory, which assigns values to outcomes based on the decision maker's attitude towards risk, loss and other factors
marketing one of the fastest growing areas for the application of analytics
resulted in a better understanding of consumer behavior the use of scanner data & data generated from social media
resulted in an increased interest in marketing analytics a better understanding of consumer behavior obtained through the use of scanner data & data generated from social media
a better understanding of consumer behavior through marketing analytics leads to: the better use of advertising budgets
a better understanding of consumer behavior through marketing analytics leads to: more effective pricing strategies
a better understanding of consumer behavior through marketing analytics leads to: improved forecasting of demand
a better understanding of consumer behavior through marketing analytics leads to: improved product line management
a better understanding of consumer behavior through marketing analytics leads to: increased customer satisfaction & loyalty
levels of measurement: data breaks down into measurement and counting
levels of measurement: measurement breaks down into ratio & interval
levels of measurement: counting breaks down into ordinal & nominal
levels of measurement: ratio & interval are numerical & continuous
levels of measurement: ordinal & nominal are categorical & discrete
nominal level of measurement lowest of the 4 ways to characterize data
nominal data (naming) deals with names, categories or labels
nominal data is qualitative
examples of nominal data colors of eyes, yes or no survey response, favorite cereal
nominal data level data at this level can't be ordered in a meaningful way
nominal data level makes no sense to calculate things such as means & standard deviations
ordinal data places data objects into an order according to some quality
ordinal data should not be used in calculations
nominal & ordinal data should not be used in calculations
ordinal data (organizing) examples include a listing of top 10 cities in which to live
nominal data naming
ordinal data organizing
interval data has an order to it & all objects are an equal interval apart
interval data the difference between 2 values is meaningful
interval data does not have a natural zero point
interval data zero does not represent the absence of the property being measured
interval data example: Fahrenheit & Celsius temperature scales. You can talk about the differences in temps, but 0 does not equal the absence of temperature.
interval data this data can be used in calculations
ratio data similar to interval data but has a unique zero point
ratio data numbers can be compared as multiples of one another
ratio data level not only can sums & differences be calculated but also ratios
ratio data one measurement can be divided by any nonzero measurements resulting in a meaningful number
in business ratio data is common
ratio data examples include: income, stock price, amount of inventory & # of repeat customers. All have a unique zero point & can be compared through multiplication
data management process by which the required data is acquired, validated, stored, protected & processed
data management process by which the accessibility, reliability & timeliness of data is ensured
data management includes the cleaning, organizing & storage of collected data
business decisions are not better than the data on which they are based
reliable, relevant and complete data supports organizational efficiency & is a cornerstone of sound decision-making
complete all relevant data - such as accounts, addresses & relationships for a given customer - is linked
accurate common data problems like misspellings, typos, and random abbreviations have been cleaned up
available required data is accessible on demand; users do not need to search manually for the info
timely up-to-date information is readily available to support decisions
all measurements contain some degree of error
random error in experimental measurements is caused by unknown & unpredictable changes in the experiment
random error these changes may occur in the measuring instruments or in environmental conditions
random error should cancel themselves out over a large number of measurements
systematic error systematic errors in experimental observations usually come from the measuring instruments or experimental design
systematic error measurement errors that are constant within a data set
omission error occurs when action has not been taken or when something has been left out
skewness bias measure of the degree to which data "leans" towards one side
measurement bias prejudice in data that results when the sample is not representative of the actual population
to produce unbiased results the sample tested must be sufficiently random
information bias prejudice in data that results when either the respondent or interviewer has an agenda, or is not impartial or truly honest
response bias respondents say what they believe the questioner wants to hear
response bias can occur as a result of the way a question is worded
conscious bias occurs when the surveyor is actively seeking a certain response
consequences of sub-par data quality wasted money, bad or delayed decisions, mistrust
wasted money 16%-18% of business budgets are eaten up by poor data quality
bad or delayed decision if you suspect you're dealing with unreliable or incomplete data, you might delay making a decision. If you don't suspect it, your decision may well be wrong
mistrust poor quality data often breeds this among internal departments or externally with customers, thus leading to lost production and/or sales
qualitative data data not characterized by numbers but rather are textual, visual or oral
qualitative data focused on stories, visual portrayals, meaningful characterizations, interpretations & other expressive descriptions
quantitative data quantity or measurement
quantitative data represents phenomena by assigning numbers in an ordered and meaningful way
qualitative research primarily exploratory research
qualitative research used to gain an understanding of underlying reasons, opinions & motivations
qualitative research provides insights into the problem or helps to develop ideas or hypotheses for potential quantitative research
observational study used because it is impractical or impossible to control the conditions of the study
observational study response variables can be observed within their natural environment, giving the sense that they haven't been artificially constrained
observational study generally considered weaker in terms of statistical inference
quantitative research used to quantify the problem by generating numerical data that can be transformed into usable stats
elements of experimental study experimental units, treatments & responses
experimental units the subjects or objects under observation
treatments the procedures applied to each subject
responses the effects of the experimental treatments
common purpose: qualitative research discover ideas, used in exploratory research w/general research objects
common purpose: quantitative research test hypotheses or specific research questions
approach: qualitative research observe & interpret
approach: quantitative research measure & test
data collection approach: qualitative unstructured, free-form
data collection approach: quantitative structured response, categories provided
researcher independence: qualitative researcher is intimately involved, results are subjective
researcher independence: quantitative researcher is uninvolved observer, results are objective
samples: qualitative research small samples - often in natural settings
samples: quantitative research large samples to produce generalizable results (results that apply to other situations)
most often used: qualitative research exploratory research designs
most often used: quantitative research descriptive & causal research designs
big data has very little use if we don't use tools to make it manageable and applicable in some way to help make business decisions
big data usually requires the use of a computer to manage
interval data can not be multiplied or divided
interval data examples include time, date, temps
ratio data all analysis techniques can be used with this data type
example of descriptive type of analysis median household income in new york is $58,000
example of predictive type of analysis employee turnover for an organization will be 5%
example of prescriptive type of analysis alternating 2 EMT teams between 5p & 10p will improve emergency response time
example of nominal data types of cars produced in a factory
example of ratio data distance between LA & San Francisco because it can be less than whole miles
example of ordinal data age of survey participants because it's putting it on a scale
example of ratio data total sales for the day in dollars because it can be less than whole dollars
systematic error scale that doesn’t return to 0 after each weigh
use of data in decision making can lead to more effective and accurate decisions/recommendations
use of data in decision making proves if samples are statistically the same or different
samples must be drawn randomly to increase likelihood of sample representing population of interest
example of nonrandom sample asking friends their opinion of a new product
situation which may create a response bias a survey of employees conducted by their supervisor on job satisfaction
conscious bias example a marketing manager conducting a focus group on Levi's men's suits asking "Wouldn't you like to see a Levi's tab on your business suit?"
missing data or refusals everyone is not represented & skews average satisfaction scores
small sample sizes assumptions made might not be realistic
wrong tool leads to poorer results.
lack of blinding knowledge of which group is receiving which treatment
associating causality example an increase in storks' nests occurs in a village where there is an increase in childbirth
example of when to use a control chart manufacturing fabric to see if the color of each bolt is consistent and in control
example of when to use trend analysis marketing research to see if TV advertising is effective over time
example of when to use regression finding cause & effect of dietary changes on health
example of when to use linear programming calculating the optimal path for a plane to fly
example of when to use ANOVA (3 or more comparison groups) comparing outcomes of students' ACT scores from each state
example of when to use t-test comparing airline satisfaction between business & vacation travellers
example of when to use chi square (categorical data) determining if women statistically prefer watching crime series dramas more than men
example of when to use crossover analysis deciding if automation is less expensive in the long-term over manual processes
select variables that are observable & measurable
control other independent variables such as when testing out a new pizza variety, if some respondents are hungry, they will skew results
identify all assumptions made in the study
check to see if any biases existed in the research
look for other research in this area to build on their concepts & to see if results between studies are consistent or not (and understand why or why not)
cite and utilize credible sources
mean mathematical average
median value in the center of the data range (freeway median)
mode most frequent occurrence
when mode, median & mean are the same number a true normal distribution has occurred
mean most sophisticated of the 3 measures of central tendency
measures of central tendency mean, mode & median
unlike median & mode, mean is influenced by the size of the values in the dataset
extreme values or outliers have a greater influence on mean than they do on median or mode
mean can only be calculated for interval or ratio data
x-bar symbol for the sample mean
symbol for population mean is μ (the Greek letter mu)
median the point at which an equal number of scores fall above & below
median the half-way point of the data
median an equal number of values in a distribution are greater than the median & less than the median
median calculated by taking the average of the 2 numbers if you have 2 data points
median if you have an even number of values, this is the halfway point between the 2 middle data points
mode least sophisticated of the 3 central tendency measures
mode single score or value occurring most often
mode there can be more than one of these
mode can't be used in inferential statistics
mode is not a mathematical computation
excel median MEDIAN(B2:B16)
excel mode MODE(B2:B16)
excel mean AVERAGE(B2:B16)
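The same three central-tendency measures in Python, for comparison (a sketch using the standard statistics module; the scores are made up).

    import statistics

    scores = [70, 75, 75, 80, 85, 90]    # hypothetical data set
    print(statistics.mean(scores))       # mathematical average (Excel AVERAGE)
    print(statistics.median(scores))     # middle value (Excel MEDIAN)
    print(statistics.mode(scores))       # most frequent value (Excel MODE)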
variance formula uses σ squared (sigma squared)
standard deviation formula uses σ (sigma) & is the square root of the variance formula
interquartile range measures the difference between the 3rd & 1st quartile
to determine the quartiles order the data from lowest to highest then separate it into 4 equal groups
excel quartiles sort the data from lowest to highest then use the QUARTILE function (ex: for quartile 1, QUARTILE(A2:A13,1))
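A comparable quartile/IQR sketch in Python (numpy assumed; the data are hypothetical).

    import numpy as np

    data = sorted([12, 7, 3, 9, 15, 21, 5, 11, 18, 6, 14, 10])   # order lowest to highest
    q1 = np.percentile(data, 25)     # 1st quartile (Excel QUARTILE(range,1))
    q3 = np.percentile(data, 75)     # 3rd quartile
    iqr = q3 - q1                    # interquartile range: 3rd quartile minus 1st quartile
    print(q1, q3, iqr)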
box plot also known as a box and whiskers or hinge plot
box plot useful when showing non-parametric data as it takes into account median and percentiles rather than averages
line graph plots the relationship between 2 or more variables by using connected data points
line graph very useful when there is time series data to be summarized
line graph appropriate where the data is continuous
vertical bars in a histogram show the counts or numbers in each range
histogram graph that displays continous data
a comparison of the ranges on a histogram helps the audience understand the information presented
bar chart summarizes categorical data
bar chart data uses a number of bars with the same width, each representing a particular category
histogram measures how continuous data is distributed over various ranges
bar chart measures data that is distributed over groups or categories.
histogram useful in identifying a distribution of data
bar charts more effective for understanding, weighting or quantity of items by category
the wider the histogram distribution the greater the variance of the data
histogram distributions can have the same mean but different variances
histogram the shape of the graph illustrates how the data is distributed
scatter diagram: if a correlation exists the data points on the diagram line up along a curve or straight line across the chart
scatter diagram: if a correlation doesn't exist the data points do not fall along a curve or a line
higher R value the more highly correlated the data
R value close to 1 data is highly correlated
R value close to 0 data is not correlated
scatter diagrams or plots tools that look for relationships between 2 variables
scatter diagrams or plots do not establish statistical significance the way regression does
T-tests: paired sample same unit/person before & after treatment, looking for a significant difference
example paired sample t-test blood pressure of 1 subject is taken before and after 30 minutes of harp playing to see if there is a difference
1 sample t-test comparing sample mean to a target
example of a 1 sample t-test fire chief compares time it takes firemen to get into their truck after bell rings against a 2 min standard.
independent sample t-test AKA 2 sample t-test or student's t-test, comparing multiple sample groups
example of independent sample t-test department head of PT/OT compares hours of appointments per FTE between groups to see if there is a difference in productivity
independent sample t-test equal number of participants in each sample is NOT required
paired sample t-test equal number of participants in each sample IS required
independent sample t-test as a rule of thumb 15 or more participants should be in each group
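A hedged sketch of the three t-test variants using scipy.stats (all sample numbers are invented for illustration).

    from scipy import stats

    # paired sample: same subjects before & after a treatment
    before = [120, 132, 128, 140, 125]
    after = [118, 129, 126, 135, 124]
    print(stats.ttest_rel(before, after))

    # 1 sample: compare a sample mean to a target (e.g. a 2-minute standard)
    times = [1.8, 2.1, 1.9, 2.4, 2.0, 1.7]
    print(stats.ttest_1samp(times, popmean=2.0))

    # independent (2 sample / student's) t-test: two separate groups
    group_a = [42, 38, 45, 40, 39, 44]
    group_b = [36, 41, 35, 37, 39, 38]
    print(stats.ttest_ind(group_a, group_b))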
chi square applied when you have 1 or more categorical variables from a single population
chi square used to determine whether there is a significant association between the 2 variables.
use chi square in an election survey, voters might be classified both by gender and voting party
use chi square to determine whether gender to related to voting preference
chi square is basically just a table
one way chi square only 1 dimension/variable is referenced
null hypothesis the number noted should be approximately the same across all time periods
standard deviation is the square root of the variance
mean can be affected by outliers
interquartile range not affected by outliers
1st & 4th quartiles where outliers would be, if any
box plot whisker lines are the outer (1st & 4th) quartiles
box plot boxes are inner quartiles
scatter plot: x axis (horizontal, e.g. sex) independent variable
scatter plot: y axis (vertical) dependent variable or the result or prediction of how it will be affected by the other axis
scatter plot best way to confirm relationship is to run a regression on the points and determine the p value
p value less than .05 there IS a significant relationship between factors
p value more than .05 there is NOT a significant relationship between factors
paired sample t-test looks at the same people before AND after a treatment, looking for a treatment effect
linear regression determining if there is a statistically significant relationship between a target (dependent) variable & 1 or more predictor (independent) variables
target variable dependent variable
predictor variable independent variable
linear trend analysis determining if there is a trend over time
multiple regression determining if multiple variables have a significant relationship with the dependent variable
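A minimal regression sketch (scipy assumed, invented data) showing the p-value check described above: p below .05 rejects the null hypothesis of no relationship.

    from scipy import stats

    x = [1, 2, 3, 4, 5, 6, 7, 8]           # predictor (independent) variable
    y = [3, 5, 6, 9, 11, 12, 15, 16]       # target (dependent) variable

    result = stats.linregress(x, y)
    print(result.slope, result.intercept)  # the fitted line
    if result.pvalue < 0.05:
        print("significant relationship: reject the null hypothesis")
    else:
        print("no significant relationship: fail to reject the null")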
probability the chance of an event occurring at some point in the future
probability involves chance & outcomes and can be based on independence or associations
probability is between 0% & 100%, between 0 and 1
probability helps to determine odds of future outcomes occurring
probability helps in the assessment of what decisions to make based on likelihood of future trends
probability what is the risk of doing nothing? What is the chance of failure? How likely are we to achieve our market share goals?
probability can assist in risk management. What is the chance of a highly adverse outcome occurring?
probability common examples include: a flip of a coin, rolling a die or drawing a certain card from a deck.
probability number of possibilities that meet conditions / number of equally likely possibilities
probability combination principle total outcomes for several events = the # of possible outcomes for each event multiplied together EX. 3 dice = 6*6*6 or 216 possible outcomes
probability sample with replacement an item is removed from the group and then added back. Ex: trying on clothing then putting it back on the rack
probability sample without replacement an item is removed but not put back. Ex: lottery ball kept out when drawn
probability with overlap make the interlocking circle (Venn) diagram - add items in the circles and subtract the # in the overlap
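A small plain-Python sketch of the counting rules above (the shift-picking numbers are made up).

    import math

    # basic probability: possibilities meeting the condition / equally likely possibilities
    p_toll_booth = 1 / 6                 # 1 customer, 6 booths = about 16.7%

    # combination principle: 3 dice have 6 * 6 * 6 = 216 possible outcomes
    outcomes_three_dice = 6 ** 3

    # combination (order doesn't matter): ways to pick 3 employees out of 10 for a shift
    shift_choices = math.comb(10, 3)

    print(p_toll_booth, outcomes_three_dice, shift_choices)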
measures of dispersion range, variance & standard deviation
variance measure of how spread out data are about the mean
standard deviation measure of how far, on average, the data points are from the mean.
on mean/standard deviation chart each vertical line represents an incremental standard deviation from the mean
z-score score relating to confidence, expressed as a number of standard deviations from the mean.
z = plus or minus 0.674 50% confidence
z = plus or minus 1.645 90% confidence
z = plus or minus 1.960 95% confidence
z = plus or minus 2.576 99% confidence
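These z-values can be reproduced from the standard normal distribution; a sketch assuming scipy is available.

    from scipy.stats import norm

    for confidence in (0.50, 0.90, 0.95, 0.99):
        tail = (1 - confidence) / 2        # split the leftover probability between the two tails
        z = norm.ppf(1 - tail)             # z-score leaving `tail` in the upper tail
        print(f"{confidence:.0%} confidence: z = plus or minus {z:.3f}")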
normal distribution graph shows mean & standard deviations (ghost with lines through it)
normal distribution graph mean of the distribution determines the location of the center of the graph
large standard deviation & variance short and fat ghost
small standard deviation & variance tall and skinny ghost
normal distribution graph area under the curve represents the probability of events happening
normal distribution graph total area under the normal curve, as with any distribution is 1
cumulative probability sum of probabilities
in relation to normal distribution, a cumulative probability refers to the probability that a randomly selected score will be less than or equal to a specified value
z-score statistical measurement of a score's relationship to the mean
z-score of 0 the score is the same as the mean
z-score can be positive or negative, indicating whether it is above or below the mean & by how many standard deviations
linear trend analysis independent variable or x is always time
linear regression analysis x value is never time, it is a different variable
alternate hypothesis there IS a significant relationship between factors
null hypothesis there is NOT a significant relationship between factors
if P value for the independent value is less than .05 you reject the null hypothesis & accept alternate hypothesis
addition rule of probability question asks for probability of 1 OR another event occuring. Add the chances together
mutually exclusive events that CAN NOT occur at the same time. Ex. Rolling a 1 AND 6 on 1 die
addition rule of probability if more than 1 chance, add all of the chances together
Bayes Theorem conditional probability
Bayes Theorem "given that" usually used in the problem
Bayes Theorem: P(Spots) = P(Dog) * P(Spots|Dog) + P(Cat) * P(Spots|Cat)
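A quick numeric check of that total-probability step plus the Bayes step to recover P(Dog|Spots); the dog/cat percentages are invented purely for illustration.

    # hypothetical prior & conditional probabilities
    p_dog, p_cat = 0.6, 0.4
    p_spots_given_dog = 0.3
    p_spots_given_cat = 0.1

    # total probability: P(Spots) = P(Dog)*P(Spots|Dog) + P(Cat)*P(Spots|Cat)
    p_spots = p_dog * p_spots_given_dog + p_cat * p_spots_given_cat

    # Bayes Theorem: P(Dog|Spots) = P(Dog) * P(Spots|Dog) / P(Spots)
    p_dog_given_spots = p_dog * p_spots_given_dog / p_spots
    print(p_spots, p_dog_given_spots)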
normal distribution graph 2/3 or about 68% of the points are within 1Z or 1 standard deviation of the mean (1 below & 1 above)
normal distribution graph the 1st deviations are the 2 on either side of the mean
normal distribution graph - 1Z one place to the left of the mean
normal distribution graph + 1Z one place to the right of the mean
250 bank customers with 8.2 min avg wait time, 1.2 standard deviation, 94% prob customer waiting: you want to be 2 standard deviations from the mean. Multiply the standard deviation by the number given (1.2 * 2), then add to the mean for the positive limit and subtract from the mean for the negative limit
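Worked out as a sketch (the 94% figure is quoted from the card as given):

    mean_wait = 8.2       # average wait in minutes
    sd_wait = 1.2         # standard deviation
    k = 2                 # roughly 2 standard deviations around the mean

    lower = mean_wait - k * sd_wait    # 8.2 - 2.4 = 5.8 minutes
    upper = mean_wait + k * sd_wait    # 8.2 + 2.4 = 10.6 minutes
    print(lower, upper)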
tools used to analyze business linear programming, breakeven & crossover analysis, normal distribution & ANOVA, forecasting, cluster analysis, decision analysis, chi-square
objective of all linear programming problems the maximization or minimization of some quantity
all linear programming problems have constraints that limit the degree to which the objective can be pursued
a feasible solution satisfies all the problem's constraints
an optimal solution a feasible solution that results in the largest possible objective function value when maximizing (or smallest when minimizing)
a graphical solution method can be used to solve a linear program with two variables
linear programming problem both the objective function & the constraints are linear
linear functions functions in which each variable appears in a separate term raised to the first power and is multiplied by a constant (which could be 0)
Linear constraints are linear functions that are restricted to be "less than or equal to", "equal to", or "greater than or equal to" a constant.
linear programming problem feasible region is the region below both lines, farthest to the bottom left
linear programming extreme points points located on the line that surrounds the feasible region, usually best on corners
linear programming analysis helps with multiple variables (items) with multiple constraints
linear programming we can consider far more than 2 items
linear programming we use software that does the math
linear programming very effective way to find the best solution for multiple products w/multiple limitations to reach a "best" goal
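A minimal two-product sketch (scipy's linprog assumed; the profit figures & constraints are invented) of maximizing profit subject to constraints.

    from scipy.optimize import linprog

    # maximize 30*x1 + 20*x2; linprog minimizes, so negate the objective
    c = [-30, -20]

    # constraints: 2*x1 + 1*x2 <= 100 (labor hours), 1*x1 + 3*x2 <= 90 (material)
    A_ub = [[2, 1], [1, 3]]
    b_ub = [100, 90]

    result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
    print(result.x, -result.fun)   # optimal quantities & maximum profit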
breakeven & crossover analysis analyze trends to determine when 2 characteristics are equal
breakeven & crossover analysis consider when profits will be at a break even point? At what volume will revenues & costs be equal?
breakeven & crossover analysis At what volumes are 2 approaches equal? At what volume are total costs equal & when is one better than the other?
breakeven graph where revenue & volume cross each other is your breakeven
breakeven & crossover analysis used to analyze trends to determine when 2 characteristics are equal
breakeven usually used with profits & therefore includes revenues & costs
crossover usually looks at various options to determine when a particular option is most attractive & when another option is better
normal distribution graph the higher the curve at any point, the more likely that event will happen
mean (average) most likely outcome of the "bell curve"
standard deviation the distance around the mean where 2/3 of events happen
standard deviation determines (describes) the height & width of the normal distribution graph
large standard deviation & variance process is highly variable, wide range of things happen on a routine basis
small standard deviation & variance process is highly uniform, almost always happens
standard normal distribution has a mean of 0 & a standard deviation of 1
standard normal distribution all results are above or below the mean or ZERO
average temp in oct is 65, standard deviation of 10 mean = 65, -1 standard deviation = 55 & 1 standard deviation = 75
area under the curve on normal distribution graph represents the probability of events happening
area under the curve on normal distribution graph total area under the normal curve, as with any distribution, is 1 or 100%
the empirical rule: normal curve probabilities approximately 2/3 of data points within a dataset will be within 1 standard deviation of the mean
approximately 95% of all data points will be within 2 standard deviations of the mean
Almost all (99.7%) of the data points will be within 3 standard deviations of the mean
normal distribution graph or normal curve
normal distribution graph mean nearly equal to median
multimodal distribution multiple hills
bimodal distribution 2 hills
normal distribution 1 hill
pareto one smooth skate ramp and others with sharp points
ANOVA statistical analysis to determine whether separate data sets are different from each other or whether they are really too similar to be different
ANOVA null hypothesis AKA straw model or basic assumption
ANOVA null hypothesis all mean populations are equal
ANOVA alternate hypothesis not all of the population means are equal; at least one is significantly different
ANOVA F-value test statistic used to set the region in which we can reject the null hypothesis & show that the alternative hypothesis is accepted
if the ANOVA null hypothesis is false F-value is likely to lie in the rejection region
if the ANOVA null hypothesis is false at least one of the population means is significantly different from others, subject to the stated level of significance
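A small ANOVA sketch with scipy; the three invented groups stand in for, say, ACT scores from three states.

    from scipy.stats import f_oneway

    state_a = [21, 24, 22, 25, 23]
    state_b = [26, 27, 25, 28, 26]
    state_c = [22, 21, 23, 22, 24]

    f_value, p_value = f_oneway(state_a, state_b, state_c)
    if p_value < 0.05:
        print("reject the null: at least one mean is significantly different")
    else:
        print("fail to reject the null: the means are not significantly different")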
linear programming graph best answers one of the corners along the feasible area perimeter
to find the best of the best answers from linear programming graph plug the coordinates of the corners of the feasible area into the problem & select the one that gives the highest number/profit
crossover analysis only considers cost not revenue
breakeven analysis considers both cost & revenue
crossover graph: crossover point where 2 criteria are equal or cross
six sigma don't want to see any events occurring more than + or - 6 standard deviations away due to the extreme rarity or unlikeliness of occurrence
standard curve usually no more than + or - 3 standard deviations
cumulative probability on graph everything shaded up to a given line or point, or the entire area meeting that criteria
1/6th of the data is to the left of -1 standard deviation & 1/6th to the right of +1 standard deviation, or about 16.7% each
chance of six sigma occurring 3.4 out of 1 million chances
six sigma graph goes up to 6 standard deviations
in quality if something is 3 or more standard deviations from the mean it indicates there is a problem in the process that likely needs to be addressed
F critical usually between 2-4
if f value is higher than f critical value reject null, accept alternative
forecasting attempting to predict the future and/or events that will happen in a different situation to make the best possible decision
in order to forecast determine whether there are patterns in past events & determine which info is most likely to lead to an accurate forecast
forecasting techniques regression analysis, time series analysis, cluster analysis, decision analysis
regression analysis technique find a best-fit line through data points that most closely matches the points
regression analysis benefits allows sophisticated analysis of cost behavior & sales forecasts
regression analysis benefits provides objective benchmarks for evaluation of reliability of estimates
regression analysis disadvantages requires 15 or more data points for accuracy
regression analysis disadvantages can be influenced by outliers
regression analysis disadvantages requires informed analysis
Time series analysis technique Use past results to predict events that will occur in the future
Time series analysis benefits Aids decision making by finding patterns in data, such as sales trends
Time series analysis benefits Allows performance and productivity evaluation
Time series analysis shortcomings Assumes past data patterns will repeat in future, which may not be true
Time series analysis shortcomings Key variables may not be captured
Cluster analysis technique Plot a series of data points and look for trends or patterns that increase our understanding
Cluster analysis benefits Sorts individual data points into different groups
Cluster analysis benefits - Helps determine target markets
Cluster analysis benefits Identifies successful and unsuccessful habits and systems
Cluster analysis disadvantages Long and expensive process
Cluster analysis disadvantages There are hundreds of potential approaches to take, each specific to a certain situation
Decision analysis technique Organized analysis of a series of decisions, events, and the value of those outcomes to determine the decision most likely to give us the best outcome.
Decision analysis technique A Decision Tree is one example.
Decision analysis benefits Determines the decision with the greatest value
Decision analysis benefits Produces a value under certainty, uncertainty, and risk
Decision analysis disadvantages Quality of decision is limited to the amount of data available
Decision analysis disadvantages Does not emphasize the risk of the worst case scenario
Regression Analysis Fitting a trend line to historical data points to project into the medium to long-range
Regression Analysis: Linear trends can be found using the least squares technique
regression analysis: least squares method minimizes the sum of the squared errors (deviations)
least squares method map straight trend line with deviations hanging onto it (above and below)
multiple regression when regression has more than 1 independent variable
multiple regression hard to graph more than 3 dimensions
logistic regression used for categorical variables (on/off, brand A/Brand B) & non-linear regression line (ie curved)
autocorrelation occurs when a given data point on a time series analysis is affected by a previous data point for that time series
in ordinary regression analysis we assume that errors are independent from one another
Autoregressive Error Correction produces a superior regression analysis compared to ordinary regression analysis because it takes autocorrelation into account.
autocorrelation If the previous day was sunny and hot, it is not very likely it will snow that day. It is more likely if there was snow on the previous day.
Homoscedasticity: The Variability of the data is similar for all values of the variables
Heteroscedasticity: The Variability of the data changes as we move through different values of the variables.
Forecasts are seldom perfect
Most techniques assume an underlying stability in the system
Product family and aggregated forecasts are more accurate than individual product forecasts
time-series models 1. Moving average 2. Weighted Moving averages 3. Exponential smoothing 4. Trend projection
associative model Linear regression
Time Series Forecasting Set of evenly spaced numerical data
Time Series Forecasting Obtained by observing response variable at regular time periods
Time Series Forecasting Forecast based only on past values, no other variables important
Time Series Forecasting Assumes that factors influencing past and present will continue influence in future
Time Series Components trend, cyclical, seasonal, random
time series graph - average demand average demand horizontal line
time series graph - trend component upward or downward straight line
time series graph - actual demand wildly swinging up and down line
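A sketch of two of the time-series models named above, a simple moving average and exponential smoothing, on an invented demand series.

    demand = [120, 132, 128, 141, 150, 146, 158]   # hypothetical demand by period

    # 3-period simple moving average forecast for the next period
    moving_avg = sum(demand[-3:]) / 3

    # exponential smoothing: new forecast = alpha*actual + (1-alpha)*previous forecast
    alpha = 0.3
    forecast = demand[0]                 # seed the first forecast with the first actual
    for actual in demand[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast

    print(round(moving_avg, 1), round(forecast, 1))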
Cluster analysis also known as segmentation
Cluster analysis the process of arranging terms or values based on different variables into "natural" groups
Most often with cluster analysis these terms or values are survey responses from people.
cluster analysis There are hundreds of approaches & it is used in many different fields to have a better understanding of an industry's environment.
Cluster Analysis often a trial-and-error approach to find natural groupings
Decision analysis can be used to develop an optimal strategy when a decision maker is faced with several decision alternatives and an uncertain or risk-filled pattern of future events.
Even when a careful decision analysis has been conducted the uncertain future events make the final consequence uncertain.
The risk associated with any decision alternative is a direct result of the uncertainty associated with the final consequence
A good decision analysis includes risk analysis that provides probability information about the favorable as well as the unfavorable consequences that may occur.
A decision problem is characterized by decision alternatives, states of nature, and resulting payoffs.
decision alternatives are the different possible strategies the decision maker can employ.
states of nature future events, not under the control of the decision maker, which may occur
states of nature should be defined so that they are mutually exclusive and collectively exhaustive.
Expected Value Approach If probabilistic information regarding the states of nature is available, one may use this
Expected Value Approach the expected return for each decision is calculated by summing the products of the payoff under each state of nature and the probability of the respective state of nature occurring.
Expected Value Approach The decision yielding the best expected return is chosen.
PDC decision tree: expected values Choose the decision alternative with the largest EV.
Chi-Square Test Used with frequency of events
Chi-Square Test The number of events, not percentages
Chi-Square Test Event counts for each cell must be greater than 5
Compare our statistical test results with target number in a Chi-Square table
Chi-Square Test - Null Hypothesis often not very exciting
Chi-Square Test - Null Hypothesis no relationship between event variables
Chi-Square Test - Null Hypothesis relationship follows known theory or fact
Chi-Square Test - Null Hypothesis Choose a statistical significance (p-value)
Chi-Square Test: a p-value of 5% means that the data we found would support the Null Hypothesis only 5% or less of the time, and that we can be 95% confident that we can reject the Null Hypothesis
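A hedged sketch of the gender/voting example with scipy's chi-square test of independence (the counts are made up; note they are event counts, not percentages, and every cell is greater than 5).

    from scipy.stats import chi2_contingency

    # rows: gender, columns: party preference
    observed = [[45, 55],
                [60, 40]]

    chi2, p, dof, expected = chi2_contingency(observed)
    if p < 0.05:
        print("reject the null: gender & voting preference appear related")
    else:
        print("fail to reject the null: no evidence of a relationship")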
t-test & ANOVA the same process pretty much
if p is low null must go
t-test 2 samples
ANOVA more than 2 samples
time series use when time periods are involved
linear regression same as least squares
logistics regression better choice when dealing with curved line or categories
exponential smoothing fancy way to do weighted moving averages but it's commonly used
time series models set of evenly spaced numerical data tracked on a consistent time frame ie quarterly whose forecast is based solely on past data
time series models trend adjustment an adjustment to make sure that if sales are growing we are not underestimating by looking at past sales that are lower
time series models cyclical adjustment adjustment made based on business cycles such as recessions or boom times, related to national economic situation
time series models seasonal adjustment calendar based adjustment EX. If most of our sales are in December we will adjust December's forecasts
time series models random adjustment adjustment made based on random events such as a competitor going out of business or having a black Friday sale
cluster analysis often used in biology and many other sciences
cluster analysis usually based on survey data, used a lot by marketing
cluster analysis cares most about what characteristics belong to each group and then decides how to interact with those groups
decision tree: weighted average states of nature multiply each state's percentage by that state's value then add the state's totals together for each component. Choose most money
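A tiny worked example of that expected-value calculation; the payoffs & probabilities are hypothetical.

    # states of nature with their probabilities (must sum to 1)
    probabilities = [0.3, 0.5, 0.2]            # strong, average, weak demand

    # payoff of each decision alternative under each state of nature
    payoffs = {
        "build large plant": [200, 80, -120],
        "build small plant": [90, 60, 10],
    }

    # EMV = sum of probability * payoff across the states; choose the largest EMV
    emv = {d: sum(p * v for p, v in zip(probabilities, vals)) for d, vals in payoffs.items()}
    print(emv, max(emv, key=emv.get))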
Quality Management Principles customer focus, leadership, continual improvement, system approach to management, mutually beneficial supplier relationship, process approach, factual approach to decision making, people involvement
Quality Management Principle: Customer focus Organizations depend on their customers and therefore should understand current and future customer needs, should meet customer requirements and strive to exceed customer expectations.
QM customer focus benefit Increased revenue and market share obtained through flexible and fast responses to market opportunities
QM customer focus benefit Increased effectiveness in the use of the organization’s resources to enhance customer satisfaction
QM customer focus benefit Improved customer loyalty leading to repeat business.
Quality Management Principle: Leadership Leaders establish unity of purpose and direction of the organization. They should create and maintain the internal environment in which people can become fully involved in achieving the organization’s objectives.
QM leadership benefit People will understand and be motivated towards the organization’s goals and objectives
QM leadership benefit Activities are evaluated, aligned and implemented in a unified way
QM leadership benefit Miscommunication between levels of an organization will be minimized.
Quality Management Principle: continual improvement Continual improvement of the organization’s overall performance should be a permanent objective of the organization.
QM continual improvement benefit Performance advantage through improved organizational capabilities
QM continual improvement benefit Alignment of improvement activities at all levels to an organization’s strategic intent
QM continual improvement benefit Flexibility to react quickly to opportunities.
Quality Management Principle: system approach to mgmt Identifying, understanding and managing interrelated processes as a system contributes to the organization’s effectiveness and efficiency in achieving its objectives.
QM system approach to mgmt benefit Integration and alignment of the processes that will best achieve the desired results
QM system approach to mgmt benefit Ability to focus effort on the key processes
QM system approach to mgmt benefit Providing confidence to interested parties as to the consistency, effectiveness and efficiency of the organization
Quality Management Principle: mutually beneficial supplier relationship An organization and its suppliers are interdependent and a mutually beneficial relationship enhances the ability of both to create value
QM mutually beneficial supplier relationship benefit Increased ability to create value for both parties
QM mutually beneficial supplier relationship benefit Flexibility and speed of joint responses to changing market or customer needs and expectations
QM mutually beneficial supplier relationship benefit Optimization of costs and resources.
Quality Management Principle: process approach A desired result is achieved more efficiently when activities and related resources are managed as a process.
QM process approach benefit Lower costs and shorter cycle times through effective use of resources
QM process approach benefit Improved, consistent and predictable results
QM process approach benefit Focused and prioritized improvement opportunities.
Quality Management Principle: factual approach to decision making Effective decisions are based on the analysis of data and information
QM factual approach to decision making benefit Informed decisions
QM factual approach to decision making benefit An increased ability to demonstrate the effectiveness of past decisions through reference to factual records
QM factual approach to decision making benefit Increased ability to review, challenge and change opinions and decisions.
Quality Management Principle: people involvement People at all levels are the essence of an organization and their full involvement enables their abilities to be used for the organization’s benefit.
QM people involvement benefit Motivated, committed and involved people within the organization
QM people involvement benefit Innovation and creativity in furthering the organization’s objectives
QM people involvement benefit People being accountable for their own performance
QM people involvement benefit People eager to participate in and contribute to continual improvement.
Shewhart's PDCA Model Plan-Do-Check-Act
Shewhart's PDCA Model: Plan Identify the pattern and make a plan
Shewhart's PDCA Model: Do Test the plan
Shewhart's PDCA Model: Check Is the plan working?
Shewhart's PDCA Model: Act Implement the plan document
Supplier-Input-Process-Output-Customer (SIPOC) a high-level view of a process
SIPOC: S supplier person/org that provides input to a process
SIPOC: I input resource that is added to a process by a supplier
SIPOC: P process series of steps where an input converts to an output
SIPOC: O output resource that is the result of a process
SIPOC: C customer person/org that receives products or services
quality assurance An overall management plan to guarantee the integrity of data (The “system”)
quality control A series of analytical measurements used to assess the quality of the analytical data (The “tools”)
quality assurance prevention
quality control detection
quality control: focus Uncover defects so they can be fixed
quality assurance: focus Prevent defects from occurring
quality control: purpose Assess performance and recommend corrective action
quality assurance: purpose Assess capability and recommend preventive action
quality control: level Basic—recognize problems so they can be fixed
quality assurance: level Advanced—understand the intricacies of the system and predict outcomes
quality control: major activities Inspection and repair
quality assurance: major activities Design and Training
quality control: change response Reactive—take action once the problem has occurred
quality assurance: change response Proactive—take action before the problem can occur
quality control C - average, basic
quality assurance A - superior, proactive
Seven Tools of TQM (Total Quality Management) Check sheets, Scatter diagrams, Cause-and-effect diagrams, Pareto charts, Flowcharts, Histogram, Statistical process control chart
Check Sheet: An organized method of recording data (looks like a table with tick marks)
Scatter Diagram A graph of the value of one variable vs. another variable (unconnected dots on graph)
Cause-and-Effect Diagram A tool that identifies process elements (causes) that might affect an outcome
Cause-and-Effect Diagram looks like a fishbone diagram with branches on the left (causes) & a box (effect) on the right
Flowchart (Process Diagram): A chart that describes the steps in a process
Histogram A distribution showing the frequency of occurrences of a variable
Histogram looks like a bar chart in bell curve shape
Pareto Chart A graph to identify and plot problems or defects in descending order of frequency
Pareto Chart graph starts high and then tails off; can be lines or bars with a dotted cumulative-percentage (ball-and-chain style) line over them
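Illustrative sketch (not from the cards): one way to build a Pareto chart in Python, assuming matplotlib is available; the defect categories and counts are hypothetical.

# Pareto chart sketch: bars sorted in descending frequency plus a dotted cumulative-% line.
# Defect categories and counts below are hypothetical.
import matplotlib.pyplot as plt

defects = {"Scratches": 45, "Dents": 30, "Misalignment": 15, "Paint": 7, "Other": 3}

items = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)  # highest to lowest
labels = [name for name, _ in items]
counts = [count for _, count in items]
total = sum(counts)

cum_pct, running = [], 0
for c in counts:
    running += c
    cum_pct.append(100 * running / total)

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)                        # the bars, tallest first
ax1.set_ylabel("Frequency")

ax2 = ax1.twinx()                              # second y-axis for the cumulative percentage
ax2.plot(range(len(labels)), cum_pct, "ko--")  # dotted line with black dots
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

ax1.set_title("Pareto chart of defect causes")
plt.show()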
Statistical Process Control Chart A chart with time on the horizontal axis to plot values of a statistic
Statistical Process Control Chart chart with a solid target value line and dotted upper & lower control limit lines
Statistical Process Control (SPC) Uses statistics and control charts to tell when to take corrective action
Statistical Process Control (SPC) Drives process improvement
Statistical Process Control (SPC): 4 key steps Measure the process
Statistical Process Control (SPC): 4 key steps When a change is indicated, find the assignable cause
Statistical Process Control (SPC): 4 key steps Eliminate or incorporate the cause
Statistical Process Control (SPC): 4 key steps Restart the revised process
Statistical Process Control (SPC) Variability is inherent in every process: Natural or common causes AND/OR Special or assignable causes
Statistical Process Control (SPC) Provides a statistical signal when assignable causes are present
Statistical Process Control (SPC) Detect and eliminate assignable causes of variation
An SPC Chart Types of Data: Variables For variables that have continuous dimensions such as Weight, speed, length, etc.
x-charts (x-bar charts) are used to control the central tendency of the process
R-charts are used to control the dispersion (range) of the process
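Illustrative sketch (not from the cards): computing x-bar-chart and R-chart control limits in Python for subgroups of size 5, using the commonly tabulated factors A2 = 0.577, D3 = 0, D4 = 2.114 for n = 5; the measurements are hypothetical.

# x-bar and R chart limits for subgroups of n = 5 (factors from standard SPC tables).
samples = [
    [10.2, 9.9, 10.1, 10.0, 10.3],
    [10.1, 10.0, 9.8, 10.2, 10.1],
    [9.9, 10.3, 10.0, 10.1, 10.0],
]
A2, D3, D4 = 0.577, 0.0, 2.114   # tabulated factors for subgroup size n = 5

xbars = [sum(s) / len(s) for s in samples]       # subgroup means
ranges = [max(s) - min(s) for s in samples]      # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                # grand mean (center line of x-bar chart)
rbar = sum(ranges) / len(ranges)                 # average range (center line of R chart)

ucl_x = xbarbar + A2 * rbar      # x-bar chart controls the central tendency
lcl_x = xbarbar - A2 * rbar
ucl_r = D4 * rbar                # R chart controls the dispersion
lcl_r = D3 * rbar

print(f"x-bar chart: CL={xbarbar:.3f}, LCL={lcl_x:.3f}, UCL={ucl_x:.3f}")
print(f"R chart:     CL={rbar:.3f}, LCL={lcl_r:.3f}, UCL={ucl_r:.3f}")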
An SPC Chart Types of Data: Attributes Defect-related characteristics
An SPC Chart Types of Data: Attributes Classify products as either good or bad or count defects
An SPC Chart Types of Data: Attributes Categorical or discrete random variables
SPC chart: anything outside of the upper & lower limits is out of control
SPC chart: anything outside of the upper & lower limits variation due to assignable causes
SPC chart: anything inside the upper & lower limits variation due to natural causes
Control Charts for Attributes For variables that are categorical such as: Good/bad, yes/no, acceptable/unacceptable
Control Charts for Attributes Measurement is typically counting defectives
Control Charts for Attributes Charts may measure: Percent defective (p-chart) or Number of defects (c-chart)
Control Charts for Attributes: P-chart measures percent defective (p percent)
Control Charts for Attributes: C-chart measures number of defects (c count)
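Illustrative sketch (not from the cards): a p-chart with the usual 3-sigma limits in Python; the sample size and daily defective counts are hypothetical.

# p-chart sketch: fraction defective with 3-sigma control limits.
import math

sample_size = 200                         # units inspected per sample
defectives = [8, 12, 6, 15, 9, 7, 11]     # defective units found in each sample (hypothetical)

p_bar = sum(defectives) / (sample_size * len(defectives))   # average fraction defective
sigma_p = math.sqrt(p_bar * (1 - p_bar) / sample_size)

ucl = p_bar + 3 * sigma_p
lcl = max(0.0, p_bar - 3 * sigma_p)       # a proportion cannot go below zero

for i, d in enumerate(defectives, start=1):
    p = d / sample_size
    status = "in control (natural causes)" if lcl <= p <= ucl else "OUT OF CONTROL (assignable cause?)"
    print(f"sample {i}: p={p:.3f}  {status}")
print(f"center line={p_bar:.3f}, LCL={lcl:.3f}, UCL={ucl:.3f}")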
The Seven New Tools for Improvement affinity diagram, tree diagram, interrelationship digraph, process decision program chart, activity network diagram, matrix diagram, prioritization grid
New tools to analyze non-numerical data: affinity diagram A tool that groups items based on relationships which are then analyzed
New tools to analyze non-numerical data: affinity diagram Brainstorming tool that organizes large amounts of disorganized data and information into groupings based on natural relationships.
New tools to analyze non-numerical data: affinity diagram used when: You are confronted with many facts or ideas in apparent chaos.
New tools to analyze non-numerical data: affinity diagram used when: Issues seem too large and complex to grasp.
New tools to analyze non-numerical data: affinity diagram columns with a category heading with "stickies" under each heading
New tools to analyze non-numerical data: Interrelationship digraph info in boxes placed around a perimeter with arrows drawn between boxes in different colors denoting relationships between the boxes
New tools to analyze non-numerical data: Interrelationship digraph This tool displays all the interrelated cause-and-effect relationships and factors involved in a complex problem and describes desired outcomes.
New tools to analyze non-numerical data: Interrelationship digraph The process of creating an interrelationship digraph helps analyze the natural links between different aspects of a complex situation.
New tools to analyze non-numerical data: Tree diagram basically a flow chart with double slanted lines coming out of subordinate boxes
New tools to analyze non-numerical data: Tree diagram A hierarchical tool that breaks a topic down into its components
New tools to analyze non-numerical data: Tree diagram Breaks down broad categories into finer and finer levels of detail.
New tools to analyze non-numerical data: Tree diagram Developing a tree diagram directs concentration from generalities to specifics.
New tools to analyze non-numerical data: Prioritization matrix grid table
New tools to analyze non-numerical data: Prioritization matrix A table or chart that helps a team prioritize multiple options, based on how well these options satisfy preselected criteria
New tools to analyze non-numerical data: Prioritization matrix Prioritizes items in terms of weighted criteria.
New tools to analyze non-numerical data: Prioritization matrix It uses a combination of tree and matrix diagramming techniques to do a pair-wise evaluation of items and to narrow down options to the most desired or most effective.
Popular applications for the Prioritization Matrix include return on investment (ROI) or cost-benefit analysis (investment vs. return), time management matrix (urgency vs. importance), etc.
New tools to analyze non-numerical data: Matrix diagram grid table that uses symbols
New tools to analyze non-numerical data: Matrix diagram A table or chart that shows the strength of the relationships between items or sets of items
New tools to analyze non-numerical data: Matrix diagram At each intersection a relationship is either absent or present.
New tools to analyze non-numerical data: Matrix diagram It then gives information about the relationship, such as its strength, the roles played by various individuals or measurements.
New tools to analyze non-numerical data: Process decision program chart Like a tree diagram, the PDPC shows a hierarchy of events or ideas but the intent of the PDPC is more defined
New tools to analyze non-numerical data: Process decision program chart the lowest tier of the PDPC illustrates the corrective and preventive actions that can be taken to mitigate risks or overcome process problems.
New tools to analyze non-numerical data: Process decision program chart Specifically designed to help teams mitigate risks and solve potential problems.
New tools to analyze non-numerical data: Process decision program chart Different shaped boxes are used to highlight risks and identify possible countermeasures
PDPC Process decision program chart
New tools to analyze non-numerical data: Network diagram A decision diagram with a green ball on the left & a red ball on the right
New tools to analyze non-numerical data: Network diagram A scheduling diagram that shows the relationships between project activities
New tools to analyze non-numerical data: Network diagram It is used when subtasks occur in parallel.
New tools to analyze non-numerical data: Network diagram The diagram helps in determining the critical path (longest sequence of tasks)
critical path longest sequence of tasks
TQM Encompasses entire organization, from supplier to customer
TQM Stresses a commitment by management to have a continuing, companywide drive toward excellence in all aspects of products and services that are important to the customer
TQM encompasses Continuous improvement, Statistical Quality Control, Employee empowerment, Benchmarking, Just-in-time (JIT), Knowledge of TQM tools
Lean Operations are externally focused on the customer
Lean Operations Emphasis on understanding the customer and what the customer wants
Lean Operations Optimizes the entire process from the customer’s perspective
Lean classifies every activity that we do into three types value add, non value-add but essential & waste
Lean Ops: Value Add activities that a customer would be willing to pay for which help create the final form or function of the finished article
Lean Ops: Non Value Add but Essential things that need to be done, but that don’t bring any value to the finished article (e.g. waiting for a document to print, the time it takes for paint to dry etc.)
Lean Ops: Waste actions that bring no value to the article and are therefore unnecessary
Just-In-Time (JIT) Powerful strategy for improving operations
Just-In-Time (JIT) Materials arrive where they are needed when they are needed
Just-In-Time (JIT) Identifying problems and driving out waste reduces costs and variability and improves throughput
Just-In-Time (JIT) Requires a meaningful buyer-supplier relationship
Six Sigma Statistical definition of a process that is 99.9997% capable, 3.4 defects per million opportunities (DPMO)
Six Sigma: DPMO defects per million opportunities
Six Sigma A program designed to reduce defects, lower costs, and improve customer satisfaction
Six Sigma graphing uses bell curve graph
Six Sigma graphing 3 standard deviations is 2,700 defects/million
Six Sigma graphing 6 standard deviations are 3.4 defects/million
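Illustrative sketch (not from the cards): the DPMO arithmetic in Python; the unit, opportunity, and defect counts are hypothetical.

# DPMO sketch: defects per million opportunities.
# Six Sigma quality corresponds to 3.4 DPMO, i.e. a 99.9997% yield.
units = 5000            # units produced (hypothetical)
opportunities = 10      # defect opportunities per unit (hypothetical)
defects = 17            # defects observed (hypothetical)

dpmo = defects / (units * opportunities) * 1_000_000
yield_pct = (1 - defects / (units * opportunities)) * 100

print(f"DPMO = {dpmo:.1f}")
print(f"Yield = {yield_pct:.4f}%")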
Lean Six Sigma It combines the streamlining and waste-elimination concepts of Lean practices with the variation- and quality-control ideas of Six Sigma
Combining Lean's focus on enhancing customer value with Six Sigma's optimization of process work simultaneously reduces inefficiency, accelerates production, and increases quality
Design for Six Sigma (DFSS) methodology does not wait to correct inefficiencies in processes
Design for Six Sigma (DFSS) methodology it incorporates Six Sigma practices into the design of the processes
By taking a proactive approach to quality management, DFSS ensures that variation is minimized from the outset of a project, eliminating the need for corrective actions later in project work.
Reliability Generally defined as the ability of a product to perform as expected over time
Reliability Formally defined as the probability that a product, piece of equipment, or system performs its intended function for a stated period of time under specified operating conditions
Types of Reliability inherent & achieved
Inherent reliability predicted by product design (robust design)
Achieved reliability observed during use
R1 = reliability of component 1
Rs = R1 x R2 x R3 x … x Rn
Reliability of the process is Rs = R1 x R2 x R3 = .90 x .80 x .99 = .713 or 71.3%
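Illustrative sketch (not from the cards): the series-reliability product Rs = R1 x R2 x ... x Rn in Python (math.prod requires Python 3.8+), reproducing the 0.90 x 0.80 x 0.99 example above.

# Series reliability: the system works only if every component works.
from math import prod

component_reliabilities = [0.90, 0.80, 0.99]   # R1, R2, R3 from the example above
Rs = prod(component_reliabilities)             # Rs = R1 x R2 x R3
print(f"System reliability Rs = {Rs:.3f} ({Rs:.1%})")   # 0.713, i.e. 71.3%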
ISO The International Organization for Standardization
The International Organization for Standardization (ISO) established a certification program to guarantee that organizations are dedicated to quality concepts and are continually working to ensure the highest level of quality possible
ISO Certification shows that an organization has a quality management system in place to monitor and control quality issues and is continuing to meet the needs of customers and stakeholders with high-quality products and services.
International Organization for Standardization (ISO)- mission is to promote the development of standardized products to facilitate trade and cooperation across national borders
International Organization for Standardization (ISO)- Representatives from more than 146 nations
ISO 9000 series of standards sets requirements for quality processes
Nearly half a million ISO 9000 certificates have been awarded to companies around the world.
ISO 14000 series also sets standards for operations that minimize harm to the environment.
ISO Developed the quality management principles
80/20 rule 80% of problems are caused by 20% of the causes
check sheet counts how many times something occurs
cause/effect (fishbone) diagram top branches are primary issues, wings off the branches are secondary issues
pareto chart black dots show the cumulative effect of issues
run chart shows data in relation to a target value
control chart run chart with control limits added.
statistical process control chart control chart calculated via statistical means or processes
statistical process control chart tells WHEN to take corrective action not what to do
statistical process control chart early warning system
natural or common cause variation in control
special or assignable cause variation out of control, created by something in the process, what you're looking for on SPC chart
trends within the control limits can also indicate a system issue
x-charts area within the control limits is shaded in
7 new tools for improvement AKA Ishikawa tools
order of 7 new tools use affinity -> interrelationship -> tree -> prioritization matrix or matrix diagram -> process decision program chart -> network diagram
lean ops waste from a customer perspective inventory, movement to warehousing and quality programs would be considered this
Just-In-Time (JIT) internal operations perspective
Just-In-Time (JIT) opposite of lean ops focus, often run in conjunction with lean ops
6 standard deviations from mean 99.9997% perfection (3.4 DPMO)
Rs (R with subscript S) overall system reliability
RBM Results based management
results based management (RBM) uses results as the central measure of performance
results based management (RBM) translates goals into results, clearly defined accountability of results & requires monitoring and self-assessment
results based management (RBM) Takes a life-cycle approach
results based management (RBM) continuous measurement and performance evaluation; must be measurable using data
Steps of RBM Resources (inputs) -> activities -> results (outputs) -> outcomes -> impact
RBM requires partnerships and inclusiveness; shared expectations; & transparency, simplicity, and flexibility
Results-based management tools table includes expected results, indicators, baseline data, targets, data sources, data collection methods, frequency & responsibility
Performance Measures Used to measure results, effectiveness, and/or efficiency of an individual, group or the entire company
Performance Measures Answer such questions as: How are we doing? What do we need to do to improve? Where is problem solving needed?
Performance Measures Should be linked to a company’s goals and/or strategy e.g., goal is to gain market share by improving customer satisfaction from 70% to 80%
Performance Indicators virtually anything that can be tracked and quantified,
Performance Indicators examples include: Financial performance, Customer satisfaction, Quality of programs or services, Employee retention, Safety statistics, Energy consumption
Business Improvement Analytics: Index numbers are a common analytic for business improvement
Business Improvement Analytics: Index numbers commonly represent the change in price or quantity over time for goods and/or services EX: Consumer Price Index
Business Improvement Analytics: Consumer price index “basket” of assorted consumer goods and services that are purchased by a common household
Business Improvement Analytics: Consumer price index watched closely because it is considered a main measure of inflation
Business Improvement Analytics: CPI Consumer Price Index
Business Improvement Analytics: CPI Basket includes communication, healthcare, education, transportation, recreation, food, housing & clothing
Business Improvement Analytics: Indices are usually relative to a base period that is represented as a value of 100
Business Improvement Analytics: Index graph Average is shown in addition to actual
Types of Indices: Simple Index Number, Simple Composite Index & Weighted Composite Index
Business Improvement Analytics: Simple Index Number Price or quantity relative to a base period of 100
Price of Big Mac in 1968=$1.60, Price of Big Mac in 2014 = $4.80; express the Simple Index Number for the 2014 price of a Big Mac, using the 1968 price as the Base Period Answer = ($4.80 / $1.60) * 100 = 300 Note: 1968 price as a simple index = ($1.60 / $1.60) * 100 = 100
Business Improvement Analytics: Simple Composite Index an index based on a combination of items/measures without weighting any data more significantly than any other data
suppose a Brand Equity index is created based on 5-point ratings of Quality, Value, and Uniqueness. ; Assume the average summed ratings score is 11.5 across 300 brands What is the Simple Composite “Brand Equity” Index for a brand with a summed rating score of 12.7? = (12.7 / 11.5) * 100 = 110.4 = 110
Weighted Composite Index similar to a Simple Composite Index except that certain variables are given more weight than others when calculating the index
Business Improvement Analytics: Weighted Composite Index e.g., suppose that consumer perceptions of brand Quality and Value are more important than Uniqueness in driving brand sales
Business Improvement Analytics: Weighted Composite Index Could create a “Brand Equity” Index with the following weights ; 50% weight to perceived Quality ; 30% weight to perceived Value ; 20% weight to perceived Uniqueness
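Illustrative sketch (not from the cards): the three index types as small Python functions, reusing the Big Mac and brand-equity figures above; the weighted-composite ratings and base ratings are hypothetical, with the 50/30/20 weights from the example.

# Index-number sketches based on the examples above.

def simple_index(value, base_value):
    """Price or quantity relative to a base period of 100."""
    return value / base_value * 100

print(simple_index(4.80, 1.60))    # Big Mac: 2014 price vs. 1968 base -> 300.0

def simple_composite_index(summed_score, average_summed_score):
    """Unweighted combination of measures relative to the average (base = 100)."""
    return summed_score / average_summed_score * 100

print(round(simple_composite_index(12.7, 11.5)))   # brand equity example -> 110

def weighted_composite_index(scores, weights, base_scores):
    """Weighted combination of measures relative to a weighted base (base = 100)."""
    weighted = sum(s * w for s, w in zip(scores, weights))
    base = sum(b * w for b, w in zip(base_scores, weights))
    return weighted / base * 100

# Hypothetical ratings for Quality, Value, Uniqueness with 50/30/20 weights.
print(round(weighted_composite_index([4.5, 4.0, 3.5], [0.5, 0.3, 0.2],
                                     [3.8, 3.9, 3.7]), 1))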
Health Care Analytics: Commonly used metrics - Rate frequency of an event per time period ; e.g. birth rate - # births per 1,000 people in a year
Health Care Analytics: Commonly used metrics - Ratio – measure of one quantity in relation to another ; e.g., gender ratio for Alzheimer’s ~ 2:1 females to males
Health Care Analytics: Commonly used metrics - Proportion ratio of a group to the whole
Health Care Analytics: Commonly used metrics - Prevalence (number of cases) / (total population) ; e.g., 20% of African Americans 75+ yrs old have Alzheimer’s
Health Care Analytics: Commonly used metrics - Incidence only considers new cases
Health Care Analytics: Commonly used metrics - Cumulative incidence (# new cases in a particular time period) / (population)
Health Care Analytics: incidence rate (# new cases) / (person-time units) ; person-time units -> cumulative amount of time each person was studied
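Illustrative sketch (not from the cards): the rate, prevalence, cumulative-incidence, and incidence-rate formulas in Python; every count below is hypothetical.

# Health care metric sketches; all counts are made up for illustration.
births, population = 1300, 100_000
birth_rate = births / population * 1000            # rate: births per 1,000 people per year

cases, total_population = 420, 2_100
prevalence = cases / total_population              # prevalence: existing cases / total population

new_cases, at_risk = 35, 2_000
cumulative_incidence = new_cases / at_risk         # new cases in a period / population

person_years = 5_600                               # cumulative time each person was studied
incidence_rate = new_cases / person_years          # new cases per person-time unit

print(f"birth rate = {birth_rate:.1f} per 1,000")
print(f"prevalence = {prevalence:.1%}")
print(f"cumulative incidence = {cumulative_incidence:.2%}")
print(f"incidence rate = {incidence_rate:.4f} per person-year")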
Education Analytics: Test Construction - Norm Referenced Tests compare an individual to others ; e.g., standard score (Z-score)
Education Analytics: Test Construction - Criterion Referenced Tests compare an individual to defined standards ; e.g., exam cut-score
Education Analytics: Test Construction - True Score Theory for a test without systematic error, the observed score is the true score plus random error
Education Analytics: Test Construction - True Score Theory: systematic error occurs when something unrelated to the test per se is affecting the results (e.g., jack-hammering was going on outside test center)
Education Analytics: Test Construction - Item Response Theory takes into account that different questions have different levels of difficulty ; e.g., SAT and GMAT
Government/Public Sector Analytics Pressures to deliver public services at lower costs ; i.e., reform spending habits and optimize the allocation of public benefits (public welfare)
Government/Public Sector Analytics: Cost-benefit analysis In public sector, not as straightforward (e.g., global warming; spending with no revenue) ; Attempt to measure the benefit to the general welfare of the public
Government/Public Sector Analytics:; Benchmarking e.g., anticipated cost of new transit system relative to actual cost of similar transit system of other cities
Government/Public Sector Analytics: Payback Period e.g., installing solar panels on municipal buildings
Non-Profit Analytics: ; Cost-effectiveness of initiatives Determine a quantifiable goal ; Analyze the progress, success, and cost of achieving the predetermined goal ; e.g., WGU “online, accelerated, affordable, accredited” ; WGU has not raised tuition rates since 2008
Non-Profit Analytics: Compare results to that of other non-profits
Private sector cost benefit analysis In private sector, seek Revenues > Cost
CPI base period used 1982-1984 = 100, all else relative to that
simple composite index our score / industry average * 100
person-time units cumulative amount of time each person was studied
Key Performance Indicators (KPIs) KPIs are performance measures that organizations use to quantify their level of success
Key Performance Indicators (KPIs) examples include: customer/patient satisfaction, sales increase, employee turnover, return customer rate
Management/Employee bonuses can be tied to KPIs
Key Performance Indicators (KPIs) often follow “SMART” criteria
Key Performance Indicators (KPIs): SMART Specific, Measurable, Attainable, Relevant, Time-bound
KPI Dashboards Visual representation of multiple KPIs
KPI Dashboards include: key areas of focus; often for seeing historical trends; can readily see if organization is meeting its goals
KPI Dashboards use when one chart, graph or piece of data does not provide enough info to make a decision
Advantages of KPIs help company track financial, productivity, etc. goals
Advantages of KPIs data-driven results that make it easier to quantify performance
Advantages of KPIs can be used as a tool across an entire organization
Advantages of KPIs internal benchmarking
Disadvantages of KPIs can be expensive and requires ongoing maintenance
Disadvantages of KPIs small KPI changes that are not statistically significant might be mistakenly viewed as meaningful
Disadvantages of KPIs might lead to focus on short-term, rather than long-term, gains
A Balanced Scorecard measures an organization’s performance on a balanced mix of financial and nonfinancial measures
balanced scorecard: variance actual performance - target performance
Balanced Scorecard Measures 4 perspectives 1. Financial 2. Customer 3. Internal business processes 4. Innovation and learning
Balanced Scorecard Measures goal: positive impact on the company’s long-term performance
Balanced scorecards include: mission/vision, objectives, measures, targets & initiatives
Balanced Scorecards Advantages better organization alignment
Balanced Scorecards Advantages better communication
Balanced Scorecards Advantages links operations with company strategy
Balanced Scorecards Advantages emphasizes strategy and organizational results
Balanced Scorecards Disadvantages requires time & effort to develop a meaningful scorecard
Balanced Scorecards Disadvantages challenges for cross-company adoption
Balanced Scorecards Disadvantages may not encourage desired behavior changes
Net Promoter Score (NPS®) considered a measure of customer loyalty
Net Promoter Score (NPS®) developed by Fred Reichheld (Bain & Company) in 2003
Calculate NPS 1. Ask respondents a single, 11-point (0 to 10) question… How likely is it that you would recommend this product or service to a friend?
Calculate NPS 2. Categorize respondents into 3 groups: Detractors, Passives & Promoters
Calculate NPS 3. NPS = % Promoters - % Detractors
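Illustrative sketch (not from the cards): computing NPS in Python from 0-10 responses, using the standard groupings (promoters 9-10, passives 7-8, detractors 0-6); the responses are hypothetical.

# NPS sketch: % promoters minus % detractors, from hypothetical 0-10 responses.
responses = [10, 9, 9, 8, 7, 6, 10, 5, 9, 3, 8, 10]

promoters = sum(1 for r in responses if r >= 9)      # scores 9-10
detractors = sum(1 for r in responses if r <= 6)     # scores 0-6
n = len(responses)

nps = (promoters / n - detractors / n) * 100         # note: NPS can be negative
print(f"NPS = {nps:.0f}")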
Should you proceed, based on NPS? only if your NPS is >= the industry NPS
NPS Advantages easily understood metric
NPS Advantages easy to calculate and monitor
NPS Advantages ; can benchmark against industry leaders
NPS Disadvantages doesn’t consider in-depth customer perception data
NPS Disadvantages may fail to predict loyalty behaviors
NPS Disadvantages doesn’t address specific areas of dissatisfaction (or satisfaction)
NPS Disadvantages “likelihood to recommend” is no better a question (and may even be worse) than “overall satisfaction” or “overall liking”
NPS Disadvantages 11-point scale may have low predictive validity
What type of graphical option could one use to show a visual summary of table data? bar chart
What statistical test could be used to determine if there are significant differences in incomes among the 3 towns? ANOVA
z-TEST 1 sample mean & 1 population mean, known population standard deviation
1 sample t-test 1 sample mean & 1 population mean, unknown population standard deviation, known sample standard deviation
independent t-test 2 sample means, unrelated samples
paired samples t-test 2 sample means, related samples
3+ sample means ANOVA
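Illustrative sketch (not from the cards): choosing among these mean-comparison tests with scipy.stats (assumed available); the z-test is computed by hand because it requires a known population standard deviation. All data are hypothetical.

# Mean-comparison test sketches (hypothetical data throughout).
import math
from scipy import stats

sample = [21.0, 19.5, 22.3, 20.1, 18.9, 21.7]

# z-test: 1 sample mean vs. 1 population mean, population SD known (computed by hand)
mu, sigma = 20.0, 1.5
xbar = sum(sample) / len(sample)
z = (xbar - mu) / (sigma / math.sqrt(len(sample)))

# one-sample t-test: population SD unknown, sample SD known
t1, p1 = stats.ttest_1samp(sample, popmean=20.0)

# independent t-test: 2 unrelated sample means
group_a = [5.1, 4.8, 5.5, 5.0]
group_b = [4.2, 4.6, 4.4, 4.1]
t2, p2 = stats.ttest_ind(group_a, group_b)

# paired-samples t-test: 2 related sample means (same subjects before/after)
before = [100, 102, 98, 97]
after = [103, 105, 99, 101]
t3, p3 = stats.ttest_rel(before, after)

# one-way ANOVA: 3+ sample means (e.g., incomes in 3 towns)
town1, town2, town3 = [52, 48, 50], [61, 59, 63], [45, 47, 44]
f, p4 = stats.f_oneway(town1, town2, town3)

print(z, p1, p2, p3, p4)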
What type of graphical option could one use to show a visual depiction of data showing the year, quarter & number of housing starts? line (trend) graph
What analytic approach should be used to determine if there is a linear trend in housing starts over time? regression/correlation
What analytic approach should be used to forecast quarterly housing starts for the next two years? regression
chi square need to know relationship between variables, both nominal (frequency counts)
correlation need to know relationship between variables, both interval or ratio (measuring something), relationship
regression (1 independent variable) or multiple regression (2+ independent variables) need to know relationship between variables, both interval or ratio (measuring something), prediction/trend
logistic regression need to know relationship between variables, binary dependent variable & independent variables: interval or ratio (measuring something)
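Illustrative sketch (not from the cards): the relationship questions above with scipy.stats (assumed available); multiple and logistic regression would typically use statsmodels or scikit-learn and are not shown. All data are hypothetical.

# Relationship-test sketches (hypothetical data throughout).
from scipy import stats

# Both variables nominal (frequency counts) -> chi-square test of independence
observed = [[30, 10],    # e.g. rows = group, columns = yes/no
            [20, 25]]
chi2, p_chi, dof, expected = stats.chi2_contingency(observed)

# Both variables interval/ratio, question is "is there a relationship?" -> correlation
mileage = [12000, 45000, 67000, 89000, 30000]
price = [21000, 17500, 14000, 11000, 19000]
r, p_corr = stats.pearsonr(mileage, price)

# Prediction/trend with one independent variable -> simple linear regression
slope, intercept, r_value, p_reg, std_err = stats.linregress(mileage, price)
predicted_price = intercept + slope * 50000     # predict price at 50,000 miles

print(p_chi, r, predicted_price)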
What type of analysis would you use to compare 3 different compounds that need to be mixed together to create concrete with constraints? linear programming
How might one display this data on county firetrucks for possible sale including town, mileage, age in years & selling price? scatter plots
How could one summarize the relationship between Selling Price and either Mileage or Age? Correlations
What analytical approach would be used to determine if Selling Price can be predicted from Mileage and/or Age? Multiple Regression
What analytical approach would be used to determine the number of units Trinity would need to sell to cover its costs? Break-Even Analysis
How would you visually depict break-even analysis results Line Graph (Break-Even Chart)
breakeven line graph anything on the left of the break even point between revenue and cost are losses
breakeven line graph anything on the right of the break even point between revenue and cost are profits
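Illustrative sketch (not from the cards): break-even arithmetic in Python; the fixed costs, price, and variable cost are hypothetical.

# Break-even sketch: units needed to cover fixed costs.
fixed_costs = 50_000            # e.g. rent, salaries (hypothetical)
price_per_unit = 25.0
variable_cost_per_unit = 15.0

contribution_margin = price_per_unit - variable_cost_per_unit
break_even_units = fixed_costs / contribution_margin
break_even_revenue = break_even_units * price_per_unit

print(f"Break-even at {break_even_units:.0f} units (${break_even_revenue:,.0f} in revenue)")
# Selling fewer units is a loss (left of the break-even point);
# selling more is a profit (right of the break-even point).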
What analytic approach should be used when calculating whether monthly labor costs on a new project experience monthly cost over-runs determination of z-scores
What measures should be calculated when tracking whether monthly labor costs on a new project experience monthly cost over-runs upper & lower control limits
What type of visual display should be used to show whether there are any monthly cost over-runs on a chart showing the month and costs for a project? statistical process control chart
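Illustrative sketch (not from the cards): z-scores and 3-sigma control limits for monthly labor costs in Python, assuming a baseline (budgeted) mean and standard deviation; all figures are hypothetical.

# Cost over-run sketch: z-scores of monthly labor costs against a baseline,
# with 3-sigma control limits. Baseline and monthly costs are hypothetical.
baseline_mean = 40_000          # budgeted / historical average monthly labor cost
baseline_sd = 2_500             # historical standard deviation

ucl = baseline_mean + 3 * baseline_sd     # upper control limit
lcl = baseline_mean - 3 * baseline_sd     # lower control limit

monthly_costs = [41_000, 39_500, 40_200, 42_100, 49_500, 40_800]

for month, cost in enumerate(monthly_costs, start=1):
    z = (cost - baseline_mean) / baseline_sd
    flag = "over-run" if cost > ucl else "in control"
    print(f"month {month}: cost={cost:,}  z={z:+.2f}  {flag}")
print(f"LCL={lcl:,}, UCL={ucl:,}")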
NPS Promoter 9 -10
NPS Passives 7-8
NPS detractors 6 or less (0-6)
can you have a negative net promoter score? yes
NPS doesn't explain why they would or would not recommend
The key focus of analytics is to predict trends using quantitative data
use data to inform you of your options and assist you in making decisions.
Nominal data is categorical. It has no numeric value. Ex. Meatball, veggie, and cheese pizzas. These are labels. They can’t be added or subtracted.
Ordinal data is ranked, but doesn’t have a specific value. Small, Medium, and large are examples
Interval data is numeric. You can add and subtract it. It has a sequential value. Each value is equally spaced from the previous value. You serve 16oz, 20 oz and 24 oz soft drinks. These are equally spaced apart.
Ratio data Your sales per day are ratio data. 10 sales of $12.99 a piece = $129.90. The values have a true zero point.
outlier Last week you were closed for two days for renovations. Sales were a zero for those two days.
when outliers are present Do you include the two zeros when calculating the average sales for the week? Do it both ways so you know how it’s affecting your bottom line.
Random happens just once and will not repeat over time. If you’re trying to find average delivery times and one delivery was affected by a four-hour Chicago traffic delay
Systematic not by chance & repeats. Frank has nursed the fuel injector on his car for the past 6 months. It breaks down 1 out of every 20 deliveries he makes.
omission error the driver didn’t clock in or out for his delivery. That data will not be included in your study even though it’s relevant
distorted A data set with an omission error
out of range error. Your range is 0 – 15 miles. Delivery 14 and 16 are not in the scope of what you are measuring because they are too far
reduce errors a survey your customer takes via text only allows them to select responses from a list. That way they can’t type anything in wrong.
treatment You are treating the pan three different ways with the three amounts of oil.
blind study The manager doesn’t know the treatment of each pizza. You do.
double-blind study Let’s say you had someone else prepare the pizzas, so you don’t know which one is which. You set out the pizzas for your staff and record the results of each of their taste tests.
PDCA: Plan List every reason a shipment could be delivered to the wrong warehouse. Make a list of possible solutions and select the one you want to try.
check sheet collect when an error occurs for each shipping location. Tally sheet
cause and effect diagram brainstorm the issues causing the shipping error.
Pareto shows us which “cause” happens the most because it sorts causes in highest-to-lowest order
Pareto a bar graph with categories shown in descending order by frequency.
PDCA: Do Implement the solution on a very small scale. We have one clerk trialing a new data entry format when she enters the orders in the computer.
PDCA: Do We should create a flow chart to clearly define the steps in the new process for the clerk.
PDCA: Check Have we had greater success with program we are piloting with the one clerk? Yes.
PDCA: Check To know if we’re getting better, we look at our performance over time. In Six weeks let’s see if shipping errors decreased.
run chart track errors for the last 6 weeks
PDCA: Act We accept the trial method as our new process and implement it with all of the shipping clerks.
Plan nothing’s been done.
Check we are measuring what’s been tried
control chart tells if we stayed within limits we set. (54 – 60 inches)
Ishikawa claims quality tools can solve 90-95% of quality problems
SIPOC diagram list all of the elements that can influence a process before it starts
Six Sigma When we want to correct a part of the process, we can use this as a problem solving method.
histogram showing the number of returned items for each year from 2012-2016.
histogram used rather than a bar chart because our x-axis values are numeric (different years in this case) rather than categories.
scatterplot For each of the 25 manufacturing employees, a point is plotted for their number of overtime hours and the number of errors made in the past 6 months.
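Illustrative sketch (not from the cards): a histogram of returns by year and a scatterplot of overtime hours vs. errors in Python with matplotlib (assumed available); all data are hypothetical.

# Histogram and scatterplot sketches; every value below is made up for illustration.
import matplotlib.pyplot as plt

# Raw "year of return" observations -> histogram (numeric x-axis, not categories)
years_of_returns = [2012] * 13 + [2013] * 11 + [2014] * 9 + [2015] * 14 + [2016] * 10

# Overtime hours and error counts for a few employees -> scatterplot
overtime_hours = [2, 10, 4, 15, 8, 20, 6]
errors = [1, 5, 2, 8, 4, 11, 3]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.hist(years_of_returns, bins=range(2012, 2018), edgecolor="black")
ax1.set_title("Returned items per year")
ax1.set_xlabel("Year")
ax1.set_ylabel("Returns")

ax2.scatter(overtime_hours, errors)
ax2.set_title("Overtime vs. errors (past 6 months)")
ax2.set_xlabel("Overtime hours")
ax2.set_ylabel("Errors")

plt.tight_layout()
plt.show()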