Friday, June 19, 2020

DP-300 Administering Relational Databases on Microsoft Azure (beta) Exam

Candidates for this exam are database administrators and data management specialists who manage on-premises and cloud relational databases built on top of Microsoft SQL Server and Microsoft Azure data services.

The Azure Database Administrator implements and manages the operational aspects of cloud-native and hybrid data platform solutions built on Microsoft Azure data services and Microsoft SQL Server. The Azure Database Administrator uses a variety of methods and tools to perform day-to-day operations, including applying knowledge of using T-SQL for administrative management purposes.

This role is responsible for management, availability, security, and performance monitoring and optimization of modern relational database solutions. This role works with the Azure Data Engineer role to manage operational aspects of data platform solutions.

Beta exams are not scored immediately because we are gathering data on the quality of the questions and the exam. Learn more about the value and importance of beta exams.

Part of the requirements for: Microsoft Certified: Azure Database Administrator Associate

Skills measured
Plan and implement data platform resources (15-20%)
Implement a secure environment (15-20%)
Monitor and optimize operational resources (15-20%)
Optimize query performance (5-10%)
Perform automation of tasks (10-15%)
Plan and implement a High Availability and Disaster Recovery (HADR) environment (15-20%)
Perform administration by using T-SQL (10-15%)

Audience Profile
Candidates for this exam are database administrators and data management specialists who manage on-premises and cloud relational databases built on top of SQL Server and Azure Data Services.

The Azure Database Administrator implements and manages the operational aspects of cloud-native and hybrid data platform solutions built on Azure Data Services and SQL Server. The Azure Database Administrator uses a variety of methods and tools to perform day-to-day operations, including applying knowledge of using T-SQL for administrative management purposes.

This role is responsible for management, availability, security and performance monitoring and optimization of modern relational database solutions. This role works with the Azure Data Engineer role to manage operational aspects of data platform solutions.

Skills Measured
NOTE: The bullets that appear below each of the skills measured are intended to illustrate how we are assessing that skill. This list is not definitive or exhaustive.

NOTE: In most cases, exams do NOT cover preview features, and some features will only be added to an exam when they are generally available (GA).

Plan and Implement Data Platform Resources (15-20%)
Deploy resources by using manual methods
• deploy database offerings on selected platforms
• configure customized deployment templates
• apply patches and updates for hybrid and IaaS deployment


Recommend an appropriate database offering based on specific requirements
• evaluate requirements for the deployment
• evaluate the functional benefits/impact of possible database offerings
• evaluate the scalability of the possible database offering
• evaluate the HA/DR of the possible database offering
• evaluate the security aspects of the possible database offering

Configure resources for scale and performance
• configure Azure SQL database/elastic pools for scale and performance
• configure Azure SQL managed instances for scale and performance
• configure SQL Server in Azure VMs for scale and performance
• calculate resource requirements
• evaluate database partitioning techniques, such as database sharding
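
For example, a single database in Azure SQL Database can be rescaled, or moved into an elastic pool, with T-SQL. A minimal sketch, assuming a hypothetical database [SalesDb] and pool [Pool1]:

-- Scale a single database to a different service objective
ALTER DATABASE [SalesDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3');

-- Move the database into an existing elastic pool
ALTER DATABASE [SalesDb] MODIFY (SERVICE_OBJECTIVE = ELASTIC_POOL(name = [Pool1]));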

Evaluate a strategy for moving to Azure
• evaluate requirements for the migration
• evaluate offline or online migration strategies
• evaluate requirements for the upgrade
• evaluate offline or online upgrade strategies

Implement a migration or upgrade strategy for moving to Azure
• implement an online migration strategy
• implement an offline migration strategy
• implement an online upgrade strategy
• implement an offline upgrade strategy

Implement a Secure Environment (15-20%)

Configure database authentication by using platform and database tools
• configure Azure AD authentication
• create users from Azure AD identities
• configure security principals
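
As an illustration, once an Azure AD admin is set for the logical server, contained database users can be created from Azure AD identities with T-SQL. A minimal sketch (the identity name is hypothetical):

-- Create a contained database user from an Azure AD user or group
CREATE USER [dba.team@contoso.com] FROM EXTERNAL PROVIDER;

-- Grant a built-in database role to the new principal
ALTER ROLE db_datareader ADD MEMBER [dba.team@contoso.com];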

Configure database authorization by using platform and database tools
• configure database and object-level permissions using graphical tools
• apply principle of least privilege for all securables

Implement security for data at rest
• implement Transparent Data Encryption (TDE)
• implement object-level encryption
• implement Dynamic Data Masking
• implement Azure Key Vault and disk encryption for Azure VMs
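
A minimal T-SQL sketch of two of these controls, assuming a hypothetical database [SalesDb] and table dbo.Customers:

-- Enable Transparent Data Encryption with a service-managed key
ALTER DATABASE [SalesDb] SET ENCRYPTION ON;

-- Mask a column with Dynamic Data Masking
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');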

Implement security for data in transit
• configure SQL DB and database-level firewall rules
• implement Always Encrypted
• configure Azure Data Gateway
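
For example, database-level firewall rules in Azure SQL Database can be managed entirely in T-SQL. A sketch with a hypothetical rule name and a documentation IP range:

-- Add or update a database-level firewall rule
EXEC sp_set_database_firewall_rule
     @name = N'AppServers',
     @start_ip_address = '203.0.113.1',
     @end_ip_address = '203.0.113.10';

-- Review the existing database-level rules
SELECT name, start_ip_address, end_ip_address
FROM sys.database_firewall_rules;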

Implement compliance controls for sensitive data
• apply a data classification strategy
• configure server and database audits
• implement data change tracking
• perform vulnerability assessment
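
As a sketch of two of these controls (table, column, and retention values are hypothetical):

-- Classify a sensitive column (SQL Server 2019 and Azure SQL)
ADD SENSITIVITY CLASSIFICATION TO dbo.Customers.Email
WITH (LABEL = 'Confidential', INFORMATION_TYPE = 'Contact Info', RANK = MEDIUM);

-- Enable change tracking at the database and table level
ALTER DATABASE [SalesDb] SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);
ALTER TABLE dbo.Customers ENABLE CHANGE_TRACKING;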

Monitor and Optimize Operational Resources (15-20%)

Monitor activity and performance
• prepare an operational performance baseline
• determine sources for performance metrics
• interpret performance metrics
• assess database performance by using Azure SQL Database Intelligent Performance
• configure and monitor activity and performance at the infrastructure, server, service, and database levels
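
For an Azure SQL database, a simple baseline query against the resource-usage DMV might look like this sketch:

-- Recent resource consumption (one row per ~15-second interval, kept for about an hour)
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;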

Implement performance-related maintenance tasks
• implement index maintenance tasks
• implement statistics maintenance tasks
• configure database auto-tuning
• automate database maintenance tasks
o Azure SQL agent jobs, Azure Automation, SQL Server Agent jobs

• manage storage capacity
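
A minimal T-SQL sketch of index and statistics maintenance (index and table names are hypothetical):

-- Reorganize a lightly fragmented index, or rebuild a heavily fragmented one online
ALTER INDEX IX_Customers_Email ON dbo.Customers REORGANIZE;
ALTER INDEX IX_Customers_Email ON dbo.Customers REBUILD WITH (ONLINE = ON);

-- Refresh optimizer statistics for the table
UPDATE STATISTICS dbo.Customers WITH FULLSCAN;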

Identify performance-related issues
• configure Query Store to collect performance data
• identify sessions that cause blocking
• assess growth/fragmentation of databases and logs
• assess performance-related database configuration parameters
o including AutoClose, AutoShrink, AutoGrowth
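
For example, enabling the Query Store and checking for blocked sessions can both be done in T-SQL (the database name is hypothetical):

-- Enable Query Store to capture runtime statistics
ALTER DATABASE [SalesDb]
SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = AUTO);

-- List sessions that are currently blocked and the session blocking them
SELECT session_id, blocking_session_id, wait_type, wait_time, command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;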

Configure resources for optimal performance
• configure storage and infrastructure resources
o optimize IOPS, throughput, and latency
o optimize tempdb performance
o optimize data and log files for performance

• configure server and service account settings for performance
• configure Resource Governor for performance

Configure a user database for optimal performance
• implement database-scoped configuration
• configure compute resources for scaling
• configure Intelligent Query Processing (IQP)
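
A short sketch of database-scoped configuration in T-SQL (the values shown are illustrative, not recommendations):

-- Adjust settings that influence query processing for this database only
ALTER DATABASE SCOPED CONFIGURATION SET MAXDOP = 4;
ALTER DATABASE SCOPED CONFIGURATION SET LEGACY_CARDINALITY_ESTIMATION = OFF;

-- Review the current database-scoped settings
SELECT name, value FROM sys.database_scoped_configurations;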


Optimize Query Performance (5-10%)

Review query plans

• determine the appropriate type of execution plan
o Live Query Statistics, Actual Execution Plan, Estimated Execution Plan, Showplan
• identify problem areas in execution plans
• extract query plans from the Query Store
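
For example, an actual plan can be requested for the current session, and stored plans can be pulled from the Query Store catalog views. A sketch (the text filter is hypothetical):

-- Return the actual execution plan along with the results of subsequent statements
SET STATISTICS XML ON;

-- Extract stored plans for matching query text from the Query Store
SELECT qt.query_sql_text, p.plan_id, CAST(p.query_plan AS xml) AS query_plan
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan  AS p ON p.query_id = q.query_id
WHERE qt.query_sql_text LIKE '%Customers%';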


Evaluate performance improvements

• determine the appropriate Dynamic Management Views (DMVs) to gather query performance information
• identify performance issues using DMVs
• identify and implement index changes for queries
• recommend query construct modifications based on resource usage
• assess the use of hints for query performance
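
A typical DMV starting point is a sketch like the following, which lists the most CPU-expensive cached statements:

-- Top 10 statements by total CPU time since they were cached
SELECT TOP (10)
       qs.execution_count,
       qs.total_worker_time / qs.execution_count AS avg_cpu_time_us,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1, 200) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;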


Review database table and index design

• identify data quality issues with duplication of data
• identify normal form of database
• assess index design for performance
• validate data types defined for columns
• recommend table and index storage including filegroups
• evaluate table partitioning strategy
• evaluate the use of compression for tables and indexes


Perform Automation of Tasks (10-15%)

Create scheduled tasks

• manage schedules for regular maintenance jobs
• configure multi-server automation
• configure notifications for task success/failure/non-completion
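
On SQL Server or Azure SQL Managed Instance, a scheduled maintenance job can be created with the msdb stored procedures. A minimal sketch (job, step, schedule, and procedure names are hypothetical):

-- Create a job with one T-SQL step and a daily 02:00 schedule
EXEC msdb.dbo.sp_add_job       @job_name = N'NightlyIndexMaintenance';
EXEC msdb.dbo.sp_add_jobstep   @job_name = N'NightlyIndexMaintenance',
                               @step_name = N'Reorganize indexes',
                               @subsystem = N'TSQL',
                               @command   = N'EXEC dbo.usp_MaintainIndexes;';  -- hypothetical procedure
EXEC msdb.dbo.sp_add_schedule  @schedule_name = N'Daily0200',
                               @freq_type = 4, @freq_interval = 1,
                               @active_start_time = 020000;
EXEC msdb.dbo.sp_attach_schedule @job_name = N'NightlyIndexMaintenance', @schedule_name = N'Daily0200';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'NightlyIndexMaintenance';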


Evaluate and implement an alert and notification strategy

• create event notifications based on metrics
• create event notifications for Azure resources
• create alerts for server configuration changes
• create tasks that respond to event notifications


Manage and automate tasks in Azure

• perform automated deployment methods for resources
• automate Backups
• automate performance tuning and patching
• implement policies by using automated evaluation modes


Plan and Implement a High Availability and Disaster Recovery (HADR) Environment (15-20%)

Recommend an HADR strategy for a data platform solution

• recommend HADR strategy based on RPO/RTO requirements
• evaluate HADR for hybrid deployments
• evaluate Azure-specific HADR solutions
• identify resources for HADR solutions


Test an HADR strategy by using platform, OS and database tools

• test HA by using failover
• test DR by using failover or restore


Perform backup and restore a database by using database tools

• perform a database backup with options
• perform a database restore with options
• perform a database restore to a point in time
• configure long-term backup retention


Configure DR by using platform and database tools

• configure replication
• configure Azure Site Recovery for a database offering


Configure HA using platform, OS and database tools

• create an Availability Group
• integrate a database into an Availability Group
• configure quorum options for a Windows Server Failover Cluster
• configure an Availability Group listener
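
A condensed sketch of creating an Availability Group and its listener in T-SQL, assuming hypothetical replicas SQLNODE1 and SQLNODE2 with an existing database mirroring endpoint on port 5022:

-- Create an Availability Group with two synchronous replicas
CREATE AVAILABILITY GROUP [AG1]
FOR DATABASE [DB1]
REPLICA ON
    N'SQLNODE1' WITH (ENDPOINT_URL = N'TCP://sqlnode1.contoso.com:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT, FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (ENDPOINT_URL = N'TCP://sqlnode2.contoso.com:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT, FAILOVER_MODE = AUTOMATIC);

-- Add a listener for client connections (IP address and subnet mask are placeholders)
ALTER AVAILABILITY GROUP [AG1]
ADD LISTENER N'AG1-Listener' (WITH IP ((N'10.10.10.10', N'255.255.255.0')), PORT = 1433);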


Perform Administration by Using T-SQL (10-15%)

Examine system health

• evaluate database health using DMVs
• evaluate server health using DMVs
• perform database consistency checks by using DBCC
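
For example (run in the context of the database being checked; the wait-stats DMV shown applies to SQL Server and Managed Instance, while Azure SQL Database uses sys.dm_db_wait_stats):

-- Run logical and allocation consistency checks
DBCC CHECKDB WITH NO_INFOMSGS;

-- Top waits since the last restart, as a quick server-health indicator
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;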


Monitor database configuration by using T-SQL

• assess proper database autogrowth configuration
• report on database free space
• review database configuration options
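
A sketch of a free-space and autogrowth report for the current database:

-- File size, free space, and growth settings (size is reported in 8-KB pages)
SELECT name,
       size / 128.0 AS size_mb,
       size / 128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128.0 AS free_mb,
       growth,
       is_percent_growth
FROM sys.database_files;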


Perform backup and restore a database by using T-SQL

• prepare databases for AlwaysOn Availability Groups
• perform transaction log backup
• perform restore of user databases
• perform database backups with options
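
A condensed sketch of a backup chain and a point-in-time restore (the paths and the STOPAT value are hypothetical):

-- Take a full backup and a transaction log backup with common options
BACKUP DATABASE [DB1] TO DISK = N'D:\Backups\DB1_full.bak' WITH CHECKSUM, COMPRESSION;
BACKUP LOG [DB1] TO DISK = N'D:\Backups\DB1_log.trn' WITH CHECKSUM;

-- Restore the chain and stop at a point in time
RESTORE DATABASE [DB1] FROM DISK = N'D:\Backups\DB1_full.bak' WITH NORECOVERY, REPLACE;
RESTORE LOG [DB1] FROM DISK = N'D:\Backups\DB1_log.trn'
    WITH STOPAT = '2020-06-19T10:00:00', RECOVERY;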


Manage authentication by using T-SQL

• manage certificates
• manage security principals


Manage authorization by using T-SQL

• configure permissions for users to access database objects
• configure permissions by using custom roles
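
A minimal sketch of a custom role that follows least privilege (role, schema, and user names are hypothetical):

-- Create a role, grant only the permissions it needs, and add a member
CREATE ROLE SalesReaders;
GRANT SELECT ON SCHEMA::Sales TO SalesReaders;
ALTER ROLE SalesReaders ADD MEMBER [report.user@contoso.com];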

QUESTION 1
You have a Microsoft SQL Server 2019 instance in an on-premises datacenter. The instance contains a 4-TB database named DB1.
You plan to migrate DB1 to an Azure SQL Database managed instance.
What should you use to minimize downtime and data loss during the migration?

A. distributed availability groups
B. database mirroring
C. log shipping
D. Database Migration Assistant

Correct Answer: A

QUESTION 2
You have 20 Azure SQL databases provisioned by using the vCore purchasing model.
You plan to create an Azure SQL Database elastic pool and add the 20 databases.
Which three metrics should you use to size the elastic pool to meet the demands of your workload? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. total size of all the databases
B. geo-replication support
C. number of concurrently peaking databases * peak CPU utilization per database
D. maximum number of concurrent sessions for all the databases
E. total number of databases * average CPU utilization per database

Correct Answer: A,C,E

QUESTION 3
You have a new Azure SQL database. The database contains a column that stores confidential information.
You need to track each time values from the column are returned in a query. The tracking information must be stored for 365 days from the date the query was executed.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Turn on auditing and write audit logs to an Azure Storage account.
B. Add extended properties to the column.
C. Turn on Advanced Data Security for the Azure SQL server.
D. Apply sensitivity labels named Highly Confidential to the column.
E. Turn on Azure Advanced Threat Protection (ATP).

Correct Answer: A,C,D



Tuesday, June 16, 2020

200-201 Understanding Cisco Cybersecurity Operations Fundamentals (CBROPS) Exam

Certification: Cisco Certified CyberOps Associate
Duration: 120 minutes (95 - 105 questions)
Available languages: English

Exam overview
This exam tests your knowledge and skills related to:
Security concepts
Security monitoring
Host-based analysis
Network intrusion analysis
Security policies and procedures

Exam Description
The Understanding Cisco Cybersecurity Operations Fundamentals (200-201 CBROPS) exam is a 120-minute assessment that is associated with the Cisco Certified CyberOps Associate certification. The CBROPS exam tests a candidate’s knowledge and skills related to security concepts, security monitoring, host-based analysis, network intrusion analysis, and security policies and procedures. The course, Understanding Cisco Cybersecurity Operations Fundamentals, helps candidates to prepare for this exam.

The following topics are general guidelines for the content likely to be included on the exam. However, other related topics may also appear on any specific delivery of the exam. To better reflect the contents of the exam and for clarity purposes, the guidelines below may change at any time without notice.

1.1 Describe the CIA triad

1.2 Compare security deployments

1.2.a Network, endpoint, and application security systems
1.2.b Agentless and agent-based protections
1.2.c Legacy antivirus and antimalware
1.2.d SIEM, SOAR, and log management

1.3 Describe security terms

1.3.a Threat intelligence (TI)
1.3.b Threat hunting
1.3.c Malware analysis
1.3.d Threat actor
1.3.e Run book automation (RBA)
1.3.f Reverse engineering
1.3.g Sliding window anomaly detection
1.3.h Principle of least privilege
1.3.i Zero trust
1.3.j Threat intelligence platform (TIP)

1.4 Compare security concepts

1.4.a Risk (risk scoring/risk weighting, risk reduction, risk assessment)
1.4.b Threat
1.4.c Vulnerability
1.4.d Exploit

1.5 Describe the principles of the defense-in-depth strategy

1.6 Compare access control models

1.6.a Discretionary access control
1.6.b Mandatory access control
1.6.c Nondiscretionary access control
1.6.d Authentication, authorization, accounting
1.6.e Rule-based access control
1.6.f Time-based access control
1.6.g Role-based access control

1.7 Describe terms as defined in CVSS

1.7.a Attack vector
1.7.b Attack complexity
1.7.c Privileges required
1.7.d User interaction
1.7.e Scope

1.8 Identify the challenges of data visibility (network, host, and cloud) in detection

1.9 Identify potential data loss from provided traffic profiles

1.10 Interpret the 5-tuple approach to isolate a compromised host in a grouped set of logs

1.11 Compare rule-based detection vs. behavioral and statistical detection

2.1 Compare attack surface and vulnerability

2.2 Identify the types of data provided by these technologies

2.2.a TCP dump
2.2.b NetFlow
2.2.c Next-gen firewall
2.2.d Traditional stateful firewall
2.2.e Application visibility and control
2.2.f Web content filtering
2.2.g Email content filtering

2.3 Describe the impact of these technologies on data visibility

2.3.a Access control list
2.3.b NAT/PAT
2.3.c Tunneling
2.3.d TOR
2.3.e Encryption
2.3.f P2P
2.3.g Encapsulation
2.3.h Load balancing

2.4 Describe the uses of these data types in security monitoring

2.4.a Full packet capture
2.4.b Session data
2.4.c Transaction data
2.4.d Statistical data
2.4.e Metadata
2.4.f Alert data

2.5 Describe network attacks, such as protocol-based, denial of service, distributed denial of service, and man-in-the-middle

2.6 Describe web application attacks, such as SQL injection, command injections, and cross-site scripting

2.7 Describe social engineering attacks

2.8 Describe endpoint-based attacks, such as buffer overflows, command and control (C2), malware, and ransomware

2.9 Describe evasion and obfuscation techniques, such as tunneling, encryption, and proxies

2.10 Describe the impact of certificates on security (includes PKI, public/private crossing the network, asymmetric/symmetric)

2.11 Identify the certificate components in a given scenario

2.11.a Cipher-suite
2.11.b X.509 certificates
2.11.c Key exchange
2.11.d Protocol version
2.11.e PKCS

3.1 Describe the functionality of these endpoint technologies in regard to security monitoring

3.1.a Host-based intrusion detection
3.1.b Antimalware and antivirus
3.1.c Host-based firewall
3.1.d Application-level whitelisting/blacklisting
3.1.e Systems-based sandboxing (such as Chrome, Java, Adobe Reader)

3.2 Identify components of an operating system (such as Windows and Linux) in a given scenario

3.3 Describe the role of attribution in an investigation

3.3.a Assets
3.3.b Threat actor
3.3.c Indicators of compromise
3.3.d Indicators of attack
3.3.e Chain of custody

3.4 Identify type of evidence used based on provided logs

3.4.a Best evidence
3.4.b Corroborative evidence
3.4.c Indirect evidence

3.5 Compare tampered and untampered disk image

3.6 Interpret operating system, application, or command line logs to identify an event

3.7 Interpret the output report of a malware analysis tool (such as a detonation chamber or sandbox)

3.7.a Hashes
3.7.b URLs
3.7.c Systems, events, and networking

4.1 Map the provided events to source technologies

4.1.a IDS/IPS
4.1.b Firewall
4.1.c Network application control
4.1.d Proxy logs
4.1.e Antivirus
4.1.f Transaction data (NetFlow)

4.2 Compare impact and no impact for these items

4.2.a False positive
4.2.b False negative
4.2.c True positive
4.2.d True negative
4.2.e Benign

4.3 Compare deep packet inspection with packet filtering and stateful firewall operation

4.4 Compare inline traffic interrogation and taps or traffic monitoring

4.5 Compare the characteristics of data obtained from taps or traffic monitoring and transactional data (NetFlow) in the analysis of network traffic

4.6 Extract files from a TCP stream when given a PCAP file and Wireshark

4.7 Identify key elements in an intrusion from a given PCAP file

4.7.a Source address
4.7.b Destination address
4.7.c Source port
4.7.d Destination port
4.7.e Protocols
4.7.f Payloads

4.8 Interpret the fields in protocol headers as related to intrusion analysis

4.8.a Ethernet frame
4.8.b IPv4
4.8.c IPv6
4.8.d TCP
4.8.e UDP
4.8.f ICMP
4.8.g DNS
4.8.h SMTP/POP3/IMAP
4.8.i HTTP/HTTPS/HTTP2
4.8.j ARP

4.9 Interpret common artifact elements from an event to identify an alert

4.9.a IP address (source / destination)
4.9.b Client and server port identity
4.9.c Process (file or registry)
4.9.d System (API calls)
4.9.e Hashes
4.9.f URI / URL

4.10 Interpret basic regular expressions

5.1 Describe management concepts

5.1.a Asset management
5.1.b Configuration management
5.1.c Mobile device management
5.1.d Patch management
5.1.e Vulnerability management

5.2 Describe the elements in an incident response plan as stated in NIST.SP800-61

5.3 Apply the incident handling process (such as NIST.SP800-61) to an event

5.4 Map elements to these steps of analysis based on the NIST.SP800-61

5.4.a Preparation
5.4.b Detection and analysis
5.4.c Containment, eradication, and recovery
5.4.d Post-incident analysis (lessons learned)

5.5 Map the organization stakeholders against the NIST IR categories (CMMC, NIST.SP800-61)

5.5.a Preparation
5.5.b Detection and analysis
5.5.c Containment, eradication, and recovery
5.5.d Post-incident analysis (lessons learned)

5.6 Describe concepts as documented in NIST.SP800-86

5.6.a Evidence collection order
5.6.b Data integrity
5.6.c Data preservation
5.6.d Volatile data collection

5.7 Identify these elements used for network profiling

5.7.a Total throughput
5.7.b Session duration
5.7.c Ports used
5.7.d Critical asset address space

5.8 Identify these elements used for server profiling

5.8.a Listening ports
5.8.b Logged in users/service accounts
5.8.c Running processes
5.8.d Running tasks
5.8.e Applications

5.9 Identify protected data in a network

5.9.a PII
5.9.b PSI
5.9.c PHI
5.9.d Intellectual property

5.10 Classify intrusion events into categories as defined by security models, such as Cyber Kill Chain Model and Diamond Model of Intrusion

5.11 Describe the relationship of SOC metrics to scope analysis (time to detect, time to contain, time to respond, time to control)


QUESTION 1
Which type of algorithm encrypts data bit by bit?

A. block
B. asymmetric
C. stream
D. symmetric

Correct Answer: C

QUESTION 2
Which of the following is deployed on an endpoint as an agent or standalone application?

A. NIPS
B. NGFW
C. HIDS
D. NIDS

Correct Answer: C

QUESTION 3
Which of the following represents an exploitable, unpatched, and unmitigated weakness in software?

A. vulnerability
B. exploit
C. threat
D. breach

Correct Answer: A

QUESTION 4
Which of the following describes a TCP injection attack?

A. Many TCP SYN packets are captured with the same sequence number, source, and destination IP address, but different payloads.
B. There is an abnormally high volume of scanning from numerous sources.
C. Many TCP SYN packets are captured with the same sequence number, but different source and destination IP addresses and different payloads.
D. An attacker performs actions slower than normal.

Correct Answer: A

QUESTION 5
How are attributes of ownership and control of an object managed in Linux?

A. permissions
B. rights
C. iptables
D. processes

Correct Answer: A

