SPLUNK ENTERPRISE CERTIFIED
ARCHITECT EXAM
What two items will a deployment plan give you a solid foundation for?
Scale Splunk deployments as they evolve
Funding plans
Implement large, enterprise deployments
Management justifications - Correct Answers -Scale Splunk deployments as they evolve
Implement large, enterprise deployments
Which of the following are not items in a deployment plan?
Deployment Goals
User roles / staffing list
Current Physical and Logging Diagrams
Splunk Deployment Topology
Backup Process
Data Source Inventory
Data Policy Definition
Suggested Splunk apps
Education / Training Plan
Start Ingesting Data
Deployment Schedule - Correct Answers -Backup Process
Start Ingesting Data
In the Splunk Deployment Process, which of the following are the three timeframes (and
which two do not belong)?
Funding
Infrastructure Planning and Build
Splunk Deployment and Data Enrichment
User planning and roll out
User Follow Up / Discussion - Correct Answers -Funding
User Follow Up / Discussion
In the Splunk Deployment Process, when is the Infrastructure Build Complete milestone
considered to occur? - Correct Answers -After the Infrastructure Planning and Build phase,
once the infrastructure is done
In the Splunk Deployment Process, name some Architect Tasks - Correct Answers -
Architecture Requirements Review
Hardware Procurement
Hardware Setup and Configurations
Network Config and Connectivity
Install Splunk (search heads, indexers, forwarders)
HA / DR Implementation
Identify Data Sources
Forwarder Allocation (see the outputs.conf sketch below)
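To make the forwarder allocation task above concrete, here is a minimal outputs.conf sketch for a universal forwarder sending to two indexers. The host names, group name, and port are illustrative placeholders, not values taken from the exam material.

  # outputs.conf on a universal forwarder (illustrative values only)
  [tcpout]
  defaultGroup = primary_indexers

  [tcpout:primary_indexers]
  # The forwarder load-balances across the listed indexers
  server = idx1.example.com:9997, idx2.example.com:9997

Receiving also has to be enabled on each indexer, for example with the CLI command splunk enable listen 9997.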
In the Splunk Deployment Process, which of the following are NOT Administrator Tasks?
Splunk Admin Training
Forwarder Data Initiation
Define Processes to ingest new data sources
Validate Data Sources and Normalize Data
Initial Searches, Alerts and Dashboards
Deliver Use Case, Dashboards and Reports
Internal Projects Communications
Deploying Splunk to New Environments
Role-Based User Training
Additional Use Case Iterations - Correct Answers -Define Processes to ingest new data
sources - Admins will work with users to onboard data and prioritize requests (see the
inputs.conf sketch below)
Deploying Splunk to New Environments
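As a sketch of what onboarding a new data source can look like in practice (assuming a file-based source on a universal forwarder; the path, index, and sourcetype below are hypothetical), a minimal inputs.conf monitor stanza might be:

  # inputs.conf on a universal forwarder (illustrative values only)
  [monitor:///var/log/nginx/access.log]
  index = web
  sourcetype = nginx:access
  disabled = 0

Before opening the source to users, the admin would typically confirm that the target index exists and that timestamps and line breaking parse correctly.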
In the Splunk Deployment Process, installing the Search Heads, Indexers and
Forwarders is considered an Administrator task (T/F) - Correct Answers -False -
considered an Architect task
In the Splunk Deployment Process, Splunk Admin Training is considered an
Administrator task (T/F) - Correct Answers -True
Gathering raw material for the deployment should be done when? Beginning, middle, or
end? - Correct Answers -Beginning
What sort of items should NOT be considered as raw material for the deployment?
Overall Goals for the Deployment
Key Users - including goals and use cases
Determining Splunk Apps to install
Current environment - including the physical environment and current monitoring tools
in use
Expected daily data ingestion
Determine Transforms Strategy for Processing Data
Data Sources - Correct Answers -Determining Splunk Apps to install
Determine Transforms Strategy for Processing Data
For the deployment plan, what sort of information should NOT be gathered about the
current IT topology?
Percentage of Windows to Unix Servers
Data Centers
Network Zones
Number and Type of Servers
Location of Users - Correct Answers -Percentage of Windows to Unix Servers
For the deployment plan, why should you ask for a network diagram?
Because it's standard to do so for any application
To find if there are security restrictions between the data centers and network zones
To find the bandwidth between data centers and network zones
Network Diagrams can be mapped into Splunk - Correct Answers -To find if there are
security restrictions between the data centers and network zones
To find the bandwidth between data centers and network zones
(T/F) The authentication process is an item that needs to be identified in the current IT
environment - Correct Answers -True - Splunk will need to be configured for it (see the
authentication.conf sketch below)
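For example, if the customer authenticates against LDAP, Splunk would be pointed at that directory in authentication.conf. The sketch below assumes an LDAP strategy with made-up host names, DNs, and group names; the exam material does not prescribe a particular authentication scheme.

  # authentication.conf (illustrative values only)
  [authentication]
  authType = LDAP
  authSettings = corp_ldap

  [corp_ldap]
  host = ldap.example.com
  port = 636
  SSLEnabled = 1
  bindDN = cn=splunk-svc,ou=service,dc=example,dc=com
  userBaseDN = ou=people,dc=example,dc=com
  userNameAttribute = uid
  groupBaseDN = ou=groups,dc=example,dc=com
  groupNameAttribute = cn
  groupMemberAttribute = member

  # Map LDAP groups to Splunk roles
  [roleMap_corp_ldap]
  admin = splunk-admins
  user = splunk-users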
When gathering data about the customer's current logging environment, what main
question should be asked about log locations? - Correct Answers -Are the logs /
data sources collected or centralized already? (one way to ingest already-centralized
syslog is sketched below)
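If the logs are already centralized on a syslog server, a common pattern is to monitor the files that server writes; a simpler illustration is a direct network input as below. The port, index, and sourcetype are placeholders.

  # inputs.conf receiving syslog directly on a forwarder or indexer (illustrative)
  [udp://514]
  sourcetype = syslog
  index = network
  connection_host = ip

Monitoring files written by a dedicated syslog daemon is usually more robust than a direct UDP input, because nothing is lost while Splunk restarts.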
Concerning the current logging environment - what questions should NOT be asked
about any tools in place?
Are there log parsing / scraping tools or scripts in place?
Are there any query tools in place? Who uses each tool?
Are logs a source for a monitoring system?
What ticketing system is in place?
Is there an expectation that Splunk will integrate with or replace tools? - Correct
Answers -All should be asked
Concerning the customer's general environment - what should be asked about the
security in the environment?
What security policies may affect the collection, retention, and reporting of data?
Can we get root access everywhere?
What approvals will be needed?
Can we use our own certificates? - Correct Answers -What security policies may
affect the collection, retention, and reporting of data?
What approvals will be needed?
Concerning the customer's general environment - what should be asked about the
regulatory matters in the environment? - Correct Answers -Are there laws, regulations,
or policies that affect data collection and reporting?
Concerning the customer's general environment - what should be asked about HA / DR
in the environment?
Can we adjust the data as necessary to fit your DR situation?
Is data replication required?
Does your data need to be searchable at all times? - Correct Answers -Is data
replication required?
Does your data need to be searchable at all times?
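The HA / DR answers above (replication, data searchable at all times) are typically addressed with indexer clustering. The server.conf sketch below is illustrative only: host names, the shared key, and the factor values are placeholders, and releases before Splunk 9 use master / master_uri instead of manager / manager_uri.

  # server.conf on the cluster manager (illustrative values only)
  [clustering]
  mode = manager
  replication_factor = 3
  search_factor = 2
  pass4SymmKey = <shared-cluster-key>

  # server.conf on each indexer (peer)
  [clustering]
  mode = peer
  manager_uri = https://cm.example.com:8089
  pass4SymmKey = <shared-cluster-key>

  [replication_port://9887]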
Which of the following questions should be asked for Data Policy?
How long should each data source be retained?
Can we use open permissions on the files to make them easier to see?
Who can see particular data elements?
What data needs protection against tampering?
What proof of integrity needs to be provided?
Can we automatically delete files after a certain time?
Will Splunk be the primary repository for data? - Correct Answers -How long should
each data source be retained?
Who can see particular data elements?
What data needs protection against tampering?
What proof of integrity needs to be provided?
Will Splunk be the primary repository for data?
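Two of the data-policy questions above (retention, and who can see particular data elements) map directly onto index and role configuration. The sketch below assumes a hypothetical web index and web_analyst role; every value is illustrative.

  # indexes.conf - retention for a hypothetical "web" index (illustrative values)
  [web]
  homePath = $SPLUNK_DB/web/db
  coldPath = $SPLUNK_DB/web/colddb
  thawedPath = $SPLUNK_DB/web/thaweddb
  # Buckets older than ~90 days are frozen (deleted unless coldToFrozenDir is set)
  frozenTimePeriodInSecs = 7776000
  maxTotalDataSizeMB = 500000

  # authorize.conf - restrict who can search that data (illustrative values)
  [role_web_analyst]
  importRoles = user
  srchIndexesAllowed = web
  srchIndexesDefault = web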