SA4

Question | Answer
In ID3, Entropy is calculated for every attribute. True
ID3 selects a best attribute that yields maximum Entropy. False
In Decision Tree, the leaf represents an outcome. True
ID3 is an unsupervised learning algorithm. False
ID3 is a supervised learning algorithm. True
The _____ tells us how much uncertainty in S was reduced after splitting set S on attribute A. GAIN
Successor of ID3. C4.5
The ID3 is a ________ algorithm. Classification
Naïve Bayes is a ______ technique with an assumption of independence among predictors. CLASSIFICATION
Neural Network is also referred to as ANN, where ‘A’ stands for _______. ARTIFICIAL
Type of Neural Network where there is no feedback from later layers back to earlier ones, and little self-learning mechanism. FEED-FORWARD
Three layers of Neural Networks : Input, _______, Output. HIDDEN
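The three-layer, feed-forward structure described in the cards above can be sketched as a single forward pass. This is a minimal illustration with hypothetical fixed weights (no training, no biases), not any particular library's API:

```python
import math

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    """One forward pass: input layer -> hidden layer -> output layer.
    Each hidden/output unit is a weighted sum of the previous layer,
    squashed by the sigmoid. There is no feedback path (feed-forward)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in w_hidden]
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden))) for ws in w_output]

# Hypothetical weights: 2 inputs, 2 hidden units, 1 output unit.
w_hidden = [[0.5, -0.4], [0.3, 0.8]]
w_output = [[1.0, -1.0]]
print(forward([1.0, 0.5], w_hidden, w_output))
```

Adding a second list of hidden weights would give the multilayer perceptron mentioned in later cards (more than one hidden layer).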
Naïve Bayes Algorithm is a classification technique based on Bayes’ Theorem with an assumption of _______________ among predictors. INDEPENDENCE
In Apriori, the _______ is the conditional probability of some item, given you have certain other items in your itemset. CONFIDENCE
The _______ closure property is an Apriori principle which means that all subsets of any frequent itemset must also be frequent. DOWNWARD
In Apriori, the ______ is the number of transactions containing the itemset divided by the total number of transactions. SUPPORT
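The support and confidence definitions in the cards above translate directly into code. A minimal sketch over hypothetical basket data (the items and transactions are made up for illustration):

```python
# Hypothetical transactions; each transaction is a set of items.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter", "bread"},
    {"milk"},
]

def support(itemset, transactions):
    """Transactions containing the itemset / total transactions."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Conditional probability of the consequent given the antecedent:
    support(antecedent U consequent) / support(antecedent)."""
    return (support(antecedent | consequent, transactions)
            / support(antecedent, transactions))

print(support({"bread", "milk"}, transactions))       # 2 of 4 transactions
print(confidence({"bread"}, {"milk"}, transactions))  # 2 of the 3 bread transactions
```

An itemset would count as frequent (per the later cards) when its support meets the minimum support threshold.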
K-means is an ________ learning algorithm. UNSUPERVISED
The ______ is the number of transactions containing the itemset divided by the total number of transactions. SUPPORT
In decision tree, splitting means removing sub-nodes of a decision node. False
In ID3 algorithm, the first step is to compute for entropy of the dataset. True
In Decision Tree, the node represents an attribute. True
In decision tree, the ______ is the process of dividing a node into two or more sub-nodes. SPLITTING
In calculating the ______ of the entire data set, we need to calculate the number of positive and negative evidences. ENTROPY
ID3 selects a best attribute that yields ________ Entropy. MINIMUM
The ______ algorithm is a probabilistic machine learning algorithm that can be used in a wide variety of classification tasks. BAYES/NAIVE BAYES
The ______ is a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs. NEURAL NETWORK
The ____ is a type of Neural Network with more than one hidden layer. MULTILAYER PERCEPTRON
Truck brake system diagnosis, vehicle scheduling, and routing systems are applications of neural networks in: _______. Transportation
K-means is a ____________ algorithm technique. CLUSTERING
A ________ rule is a data mining technique for learning correlations and relations among variables in a database. ASSOCIATION
In Apriori, the _________ is the relative number of transactions which contains an itemset relative to the total transactions. Relative support
The RBF is a type of Neural Networks that stands for: _______ Basis Function. RADIAL
Neural Network is also referred to as ________. ANN
Naïve Bayes Algorithm is a _______ technique based on Bayes’ Theorem with an assumption of independence among predictors. CLASSIFICATION
Apriori 3-step approach: Join, ______, Repeat. PRUNE
In K-means, we must randomly select k ______ from the data set as the initial cluster centroids. Data points
In Apriori, we need to define first the ______ of itemset. SIZE
K-means is a popular ___________ analysis technique for exploring a dataset. CLUSTERING
Apriori is an ________ algorithm. ASSOCIATION
A Regression Tree is a type of DT where the decision variable is Categorical. False
In Decision Tree, the branch represents an outcome. False
A classification tree is a type of DT where the decision variable is _________. CATEGORICAL
The ______ is the measure of the amount of uncertainty or randomness in data. ENTROPY
The _____ tells us how much uncertainty in S was reduced after splitting set S on attribute A. Information Gain
The entropy is the measure of the amount of ______ or randomness in data. UNCERTAINTY
Text classification and categorization is an application of neural networks in ______. Language
The Bayes Theorem allows us to predict the class given a set of features using _______. PROBABILITY
K-means picks points in multi-dimensional space to represent each of the K clusters. These are called __________. CENTROIDS
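The K-means steps in these cards (pick k, choose k data points as initial centroids, assign points, recompute centroids) can be sketched in a few lines. The point data is hypothetical and the loop uses a fixed iteration count rather than a convergence test:

```python
import random

def kmeans(points, k, iters=10, seed=0):
    """Minimal K-means on 2-D points.
    Step 1: randomly pick k data points as the initial centroids.
    Then repeat: assign each point to its nearest centroid,
    and move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assignment step: nearest centroid by squared distance
            i = min(range(k),
                    key=lambda j: (p[0] - centroids[j][0]) ** 2
                                + (p[1] - centroids[j][1]) ** 2)
            clusters[i].append(p)
        for i, c in enumerate(clusters):  # update step: mean of each cluster
            if c:
                centroids[i] = (sum(p[0] for p in c) / len(c),
                                sum(p[1] for p in c) / len(c))
    return centroids

pts = [(1, 1), (1.5, 2), (5, 7), (6, 8), (1, 0.6), (9, 11)]
print(sorted(kmeans(pts, 2)))
```

With these made-up points the two centroids settle near the small cluster around (1, 1) and the larger-valued cluster around (7, 9).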
To compute for the relative support: total number of ______ containing an itemset X / total number of transaction. TRANSACTIONS
In Apriori, the second element we need to define is the ______ of the itemset. SUPPORT
In Apriori, the third element we need to define is the ______ of the itemset. CONFIDENCE
The _____ is the conditional probability of some item given you have certain other items in your itemset. CONFIDENCE
The attribute with highest entropy will be chosen as node. False
In calculating the Entropy of the entire data set, we need to calculate the number of positive and negative evidences. True
What is the formula in calculating information gain (IG)? Gain = Entropy(S) – I(Attribute)
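The formula above, Gain = Entropy(S) − I(Attribute), is easy to verify numerically: I(Attribute) is the entropy remaining after the split, weighted by subset size. A small sketch with a made-up play/outlook dataset (the data is illustrative, not from the cards):

```python
import math
from collections import Counter

def entropy(labels):
    """Entropy(S) = -sum(p * log2(p)) over the class proportions."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(labels, attribute_values):
    """Gain = Entropy(S) - weighted entropy of the subsets after
    splitting S on the attribute (the I(Attribute) term)."""
    total = len(labels)
    subsets = {}
    for value, label in zip(attribute_values, labels):
        subsets.setdefault(value, []).append(label)
    remainder = sum((len(s) / total) * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder

# Hypothetical dataset: 3 positive and 3 negative evidences.
play = ["yes", "yes", "no", "no", "yes", "no"]
outlook = ["sunny", "sunny", "rain", "rain", "overcast", "sunny"]
print(round(entropy(play), 3))                    # 1.0 (perfectly mixed)
print(round(information_gain(play, outlook), 3))  # 0.541
```

ID3 would compute this gain for every attribute and split on the one with the highest gain, which is the same as the attribute leaving minimum entropy.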
In the formula P(Class A|Feature 1, Feature 2), P stands for __________. PROBABILITY
In the given set below, which row is a class? Row 1 = LION, DOG, ELEPHANT, GIRAFFE; Row 2 = BIG, HEAVY, BROWN, BLACK. ROW 1
The main intuition in these types of neural networks is the distance of data points with respect to the center. RADIAL BASIS FUNCTION
In Apriori, the _______ is an itemset that meets the support. FREQUENT/FREQUENT ITEMSET
In Apriori, the _____ step scans the whole database for how frequent 1-itemsets are. JOIN
Apriori learns _______ rules and is applied to a database containing a large number of transactions. ASSOCIATION
In decision tree, the _____ represents the entire population or sample and this further gets divided into two or more homogeneous sets. ROOT NODE
A type of DT where the decision variable is Categorical. CLASSIFICATION TREE
In ID3 algorithm, the first step is to compute for:________. Entropy of dataset
The attribute with ______________ will be chosen as node. Highest Gain Attribute
Speech recognition is an application of neural networks in ______. Signals
This type of neural network is an advanced version of Multilayer Perceptron. CONVOLUTIONAL
The Automobile Guidance Systems is an application of neural network in: _______. Automotive
Apriori 3-step approach: Join, Prune, _______. REPEAT
In Naïve Bayes, the dataset is divided into two parts: feature matrix and ________. RESPONSE VECTOR
The fundamental Naive Bayes assumption is that each feature makes an: independent and _____ contribution to the outcome. EQUAL
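The independent-and-equal-contribution assumption above is what makes the Naive Bayes score a simple product: P(Class) times P(feature | Class) for each feature. A minimal sketch on hypothetical animal data (classes and features echo the class/feature cards, but the counts are made up):

```python
# Hypothetical training rows: (feature tuple, class).
data = [
    (("big", "brown"), "ELEPHANT"),
    (("big", "gray"), "ELEPHANT"),
    (("small", "brown"), "DOG"),
    (("small", "black"), "DOG"),
]

def naive_bayes_score(features, cls):
    """Unnormalized P(cls | features) = prior * product of per-feature
    likelihoods, using the naive independence assumption."""
    rows = [f for f, c in data if c == cls]
    score = len(rows) / len(data)  # prior P(cls)
    for i, feat in enumerate(features):
        matches = sum(1 for f in rows if f[i] == feat)
        score *= matches / len(rows)  # P(feature_i | cls)
    return score

for cls in ("ELEPHANT", "DOG"):
    print(cls, naive_bayes_score(("big", "brown"), cls))
```

The predicted class is whichever score is larger; a real implementation would add smoothing so a single unseen feature does not zero out the whole product.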
An itemset is considered _______ if its support is no less than “minimum support threshold”. FREQUENT
A terminal node is a node with no split. True
In decision tree, a parent node is a node which is divided into sub-nodes. True
In ID3, Entropy is calculated only at the root node. False
The attribute with highest gain attribute will be chosen as node. True
In ID3 algorithm, the first step is to compute for _______ of the dataset. ENTROPY
ID3 is a : _______________ algorithm. SUPERVISED LEARNING
Type of Neural Network with more than one hidden layer. MULTILAYER PERCEPTRON
Apriori approach where itemsets that satisfy the support and confidence move onto the next round for 2-itemsets. PRUNE
In K-means algorithm, the first step is to choose a value of k number of _____ to be formed. CLUSTERS
The _____ Theorem allows us to predict the class given a set of features using probability. BAYES
The network with more than one hidden layer is called ______________. Multilayer Perceptron
A Classification Tree is a type of DT where the decision variable is Categorical. True
Character recognition is an application of neural networks in _____. Images
The ______ means that independently functioning different networks carry out sub-tasks. MODULARITY
In Decision Tree, the leaf represents an attribute. False
Created by: Sangunius