AWS SA Associate Practice Questions – 21

Question 1:

What statement correctly describes CloudWatch operation within the AWS cloud?

A. log data is stored indefinitely

B. log data is stored for 15 days

C. alarm history is never deleted

D. ELB is not supported

Answer (A)

Question 2:

What are two AWS subscriber endpoint services that are supported with SNS?

A. RDS

B. Kinesis

C. SQS

D. Lambda

E. EBS

F. ECS

Answer (C,D)

Question 3:

What AWS services work in concert to integrate security monitoring and audit within a VPC? (Select three)

A. Syslog

B. CloudWatch

C. WAF

D. CloudTrail

E. VPC Flow Log

Answer (B,D,E)

Question 4:

How is CloudWatch integrated with Lambda? (Select two)

A. tenant must enable CloudWatch monitoring

B. network metrics such as latency are not monitored

C. Lambda functions are automatically monitored through the Lambda service

D. log group is created for each event source

E. log group is created for each function

Answer (C,E)

Question 5:

What two statements correctly describe AWS monitoring and audit operations?

A. CloudTrail captures API calls, stores them in an S3 bucket, and generates a CloudWatch event

B. CloudWatch alarm can send a message to a Lambda function

C. CloudWatch alarm can send a message to an SNS Topic that triggers an event for a Lambda function

D. CloudTrail captures all AWS events and stores them in a log file

E. VPC logs do not support events for security groups

Answer (A,C)
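As a hedged sketch of option C (the alarm name, instance ID, and topic ARN below are placeholder assumptions), a CloudWatch alarm can name an SNS topic as an alarm action; a Lambda function subscribed to that topic is then invoked when the alarm fires:

import boto3

cloudwatch = boto3.client("cloudwatch")

# Placeholder names and ARNs for illustration only.
cloudwatch.put_metric_alarm(
    AlarmName="high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                 # evaluate in 5-minute periods
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    # SNS topic to notify; a Lambda function subscribed to this topic
    # is triggered by the published message.
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:alarm-topic"],
)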

Question 6:

What is required for remote management access to your Linux-based instance?

A. ACL

B. Telnet

C. SSH

D. RDP

Answer (C)

Question 7:

What are two features of CloudWatch operation?

A. CloudWatch does not support custom metrics

B. CloudWatch permissions are granted per feature and not per AWS resource

C. collect and monitor operating system and application-generated log files

D. AWS services automatically create logs for CloudWatch

E. CloudTrail generates logs automatically when an AWS account is activated

Answer (B,C)

Question 8:

You are asked to select an AWS solution that will create a log entry anytime a snapshot is taken of an RDS database instance and the original instance is deleted. Select the AWS service that would provide that feature.

A. VPC Flow Logs

B. RDS Access Logs

C. CloudWatch

D. CloudTrail

Answer (D)

Question 9:

What is required to collect application and operating system generated logs and publish them to CloudWatch Logs?

A. Syslog

B. enable access logs

C. IAM cross-account enabled

D. CloudWatch Log Agent

Answer (D)

Question 10:

What is the purpose of VPC Flow Logs?

A. capture VPC error messages

B. capture IP traffic on network interfaces

C. monitor network performance

D. monitor netflow data from subnets

E. enable Syslog services for VPC

Answer (B)
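A minimal boto3 sketch of enabling flow logs on a single network interface (the interface ID, log group, and IAM role ARN are placeholder assumptions):

import boto3

ec2 = boto3.client("ec2")

# Placeholder IDs; captures IP traffic records for one network interface.
ec2.create_flow_logs(
    ResourceIds=["eni-0123456789abcdef0"],
    ResourceType="NetworkInterface",
    TrafficType="ALL",                    # ACCEPT, REJECT, or ALL traffic
    LogGroupName="vpc-flow-logs",         # CloudWatch Logs destination
    DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/flow-logs-role",
)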

Question 11:

Select two cloud infrastructure services and/or components included with default CloudWatch monitoring.

A. SQS queues

B. operating system metrics

C. hypervisor metrics

D. virtual appliances

E. application level metrics

Answer (A,C)

Question 12:

What feature enables CloudWatch to manage capacity dynamically for EC2 instances?

A. replication lag

B. Auto-Scaling

C. Elastic Load Balancer

D. vertical scaling

Answer (B)

Question 13:

What AWS service is used to monitor tenant remote access and various security errors, including authentication retries?

A. SSH

B. Telnet

C. CloudFront

D. CloudWatch

Answer (D)

Question 14:

How does Amazon AWS isolate metrics from different applications for monitoring, storage, and reporting purposes?

A. EC2 instances

B. Beanstalk

C. CloudTrail

D. namespaces

E. Docker

Answer (D)

Question 15:

What Amazon AWS service provides account transaction monitoring and security auditing?

A. CloudFront

B. CloudTrail

C. CloudWatch

D. security group

Answer (B)

Question 16:

What two statements correctly describe CloudWatch monitoring of database instances?

A. metrics are sent automatically from DynamoDB and RDS to CloudWatch

B. alarms must be configured for DynamoDB and RDS within CloudWatch

C. metrics are not enabled automatically for DynamoDB and RDS

D. RDS does not support monitoring of operating system metrics

Answer (A,B)

Question 17:

What AWS service can send notifications to customer smartphones and mobile applications with attached video and/or alerts?

A. EMR

B. Lambda

C. SQS

D. SNS

E. CloudTrail

Answer (D)

Question 18:

A company called Acmeshell has a backup policy stating that backups need to be easily available for 6 months and then be sent to long-term archiving. How can Acmeshell use S3 to accomplish this goal?

1. Write an AWS command line tool to back up the data and send it to Glacier after 6 months

2. Use S3 bucket policies to manage the data

3. This is automatically handled by AWS

4. Use bucket lifecycle policies and set the files to go to Glacier storage after 6 months

Ans: 4

Exp: Lifecycle management defines how Amazon S3 manages objects during their lifetime. Some objects that you store in an Amazon S3 bucket might have a well-defined lifecycle:

If you are uploading periodic logs to your bucket, your application might need these logs for a week or a month after creation, and after that you might want to delete them.

Some documents are frequently accessed for a limited period of time. After that, you might not need real-time access to these objects, but your organization might require you to archive them for a longer period and then optionally delete them later. Digital media archives, financial and healthcare records, raw genomics sequence data, long-term database backups, and data that must be retained for regulatory compliance are some kinds of data that you might upload to Amazon S3 primarily for archival purposes.

For such objects, you can define rules that identify the affected objects, a timeline, and specific actions you want Amazon S3 to perform on the objects.

Amazon S3 manages object lifetimes with a lifecycle configuration, which is assigned to a bucket and defines rules for individual objects. Each rule in a lifecycle configuration consists of the following:

An object key prefix that identifies one or more objects to which the rule applies.

An action or actions that you want Amazon S3 to perform on the specified objects.

A date or a time period, specified in days since object creation, when you want Amazon S3 to perform the specified action.

You can add these rules to your bucket using either the Amazon S3 console or programmatically.
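As a hedged sketch of option 4 (the bucket name and key prefix are placeholder assumptions), a rule transitioning objects to Glacier after roughly 6 months (180 days) could be applied with boto3:

import boto3

s3 = boto3.client("s3")

# Placeholder bucket name and prefix for illustration only.
s3.put_bucket_lifecycle_configuration(
    Bucket="acmeshell-backups",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-after-6-months",
                "Filter": {"Prefix": "backups/"},   # objects the rule applies to
                "Status": "Enabled",
                "Transitions": [
                    # Move objects to Glacier 180 days after creation.
                    {"Days": 180, "StorageClass": "GLACIER"}
                ],
            }
        ]
    },
)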

Question 19:

You are going to create an Amazon Relational Database Service (RDS) instance for your production applications, and for that you require fast and consistent I/O performance. Is it the right choice to use Provisioned IOPS on RDS instances that are launched within a VPC?

1. No, Provisioned IOPS are not for RDS

2. Yes, Provisioned IOPS can be used for all RDS instances

3. Yes, Provisioned IOPS can be used, but with MySQL based instances

4. Yes, Provisioned IOPS can be used, but with Oracle based instances

Ans: 2

Exp: For any production application that requires fast and consistent I/O performance, we recommend Provisioned IOPS (input/output operations per second) storage. Provisioned IOPS storage is a storage option that delivers fast, predictable, and consistent throughput performance. When you create a DB instance, you specify an IOPS rate and storage space allocation. Amazon RDS provisions that IOPS rate and storage for the lifetime of the DB instance or until you change it.

Provisioned IOPS storage is optimized for I/O-intensive, online transaction processing (OLTP) workloads that have consistent performance requirements.

A virtual private cloud is a virtual network that is logically isolated from other virtual networks in the AWS cloud. Amazon Virtual Private Cloud (VPC) lets you launch AWS resources, such as an Amazon RDS or Amazon EC2 instance, into a VPC. The VPC can either be a default VPC that comes with your account or it could be one that you create. All VPCs are associated with your AWS account.

Provisioned IOPS is available for all RDS instances; it does not depend on the database engine or the deployment strategy (either inside or outside a VPC). You can provision a MySQL, PostgreSQL, or Oracle DB instance with up to 30,000 IOPS and 3 TB of allocated storage. You can provision a SQL Server DB instance with up to 10,000 IOPS and 1 TB of allocated storage.
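A minimal boto3 sketch of requesting Provisioned IOPS storage at instance creation (the identifier, instance class, credentials, and sizing are placeholder assumptions):

import boto3

rds = boto3.client("rds")

# Placeholder identifiers and sizing for illustration only.
rds.create_db_instance(
    DBInstanceIdentifier="prod-mysql-piops",
    DBInstanceClass="db.m5.large",
    Engine="mysql",
    MasterUsername="admin",
    MasterUserPassword="change-me-12345",  # keep real credentials in Secrets Manager
    AllocatedStorage=1000,                 # GiB of storage
    StorageType="io1",                     # Provisioned IOPS storage type
    Iops=10000,                            # IOPS rate provisioned for the instance
)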

Question 20:

When an object is lost under RRS (Reduced Redundancy Storage), which event is triggered?

1. ReducedRedundancyLostObject

2. NotifyReducedRedundancyLostObject

3. ReducedRedundancyLostObjectNotify

4. ReducedRedundancyLostObjectNotification

Ans: 1

Exp: PUT Bucket notification

This implementation of the PUT operation uses the notification subresource to enable notifications of specified events for a bucket. Currently, the s3:ReducedRedundancyLostObject event is the only event supported for notifications. The s3:ReducedRedundancyLostObject event is triggered when Amazon S3 detects that it has lost all replicas of an object and can no longer service requests for that object.

If the bucket owner and Amazon SNS topic owner are the same, the bucket owner has permission to publish notifications to the topic by default. Otherwise, the owner of the topic must create a policy to enable the bucket owner to publish to the topic. For more information about creating this policy, go to Example Cases for Amazon SNS Access Control.

By default, only the bucket owner can configure notifications on a bucket. However, bucket owners can use a bucket policy to grant permission to other users to set this configuration with s3:PutBucketNotification permission.

After you call the PUT operation to configure notifications on a bucket, Amazon S3 publishes a test notification to ensure that the topic exists and that the bucket owner has permission to publish to the specified topic. If the notification is successfully published to the SNS topic, the PUT operation updates the bucket configuration and returns the 200 OK response with an x-amz-sns-test-message-id header containing the message ID of the test notification sent to the topic.

To turn off notifications on a bucket, you specify an empty NotificationConfiguration element in your request.
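A hedged boto3 sketch of the same configuration (the bucket name and SNS topic ARN are placeholder assumptions):

import boto3

s3 = boto3.client("s3")

# Placeholder bucket and SNS topic ARN for illustration only.
s3.put_bucket_notification_configuration(
    Bucket="example-rrs-bucket",
    NotificationConfiguration={
        "TopicConfigurations": [
            {
                "TopicArn": "arn:aws:sns:us-east-1:123456789012:rrs-lost-object",
                # Fires when S3 loses all replicas of an RRS object.
                "Events": ["s3:ReducedRedundancyLostObject"],
            }
        ]
    },
)

# Passing an empty configuration turns notifications off again:
# s3.put_bucket_notification_configuration(
#     Bucket="example-rrs-bucket", NotificationConfiguration={})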

Question 21:

How do you define the "Activity Worker" within the context of Amazon Simple Workflow Service?

1. It is a piece of code or program that decides the workflow logic

2. It is a piece of code or program where you can run custom garbage collection within the workflow

3. An activity worker is a program (piece of code that implements tasks) that receives activity tasks, performs them, and provides results back

4. It is an individual task done by the workflow

Ans: 3

Exp: The fundamental concept in Amazon SWF is the workflow. A workflow is a set of activities that carry out some objective, together with logic that coordinates the activities. For example, a workflow could receive a customer order and take whatever actions are necessary to fulfil it.

Each workflow runs in an AWS resource called a domain, which controls the workflow's scope.

An AWS account can have multiple domains, each of which can contain multiple workflows, but workflows in different domains cannot interact.

When designing an Amazon SWF workflow, you precisely define each of the required activities. You then register each activity with Amazon SWF as an activity type. When you register the activity, you provide information such as a name and version, and some timeout values based on how long you expect the activity to take. For example, a customer may have an expectation that an order will ship within 24 hours. Such expectations would inform the timeout values that you specify when registering your activities.

In the process of carrying out the workflow, some activities may need to be performed more than once, perhaps with varying inputs. For example, in a customer-order workflow, you might have an activity that handles purchased items. If the customer purchases multiple items, then this activity would have to run multiple times. Amazon SWF has the concept of an activity task that represents one invocation of an activity. In our example, the processing of each item would be represented by a single activity task.

An activity worker is a program that receives activity tasks, performs them, and provides results back. Note that the task itself might actually be performed by a person, in which case the person would use the activity worker software for the receipt and disposition of the task. An example might be a statistical analyst, who receives sets of data, analyzes them, and then sends back the analysis.
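A minimal activity-worker loop sketch with boto3 (the domain, task list, and process_order helper are placeholder assumptions):

import boto3

swf = boto3.client("swf")

def process_order(order_input):
    # Hypothetical task logic standing in for real work.
    return "processed: " + order_input

while True:
    # Long-poll SWF for the next activity task on this task list.
    task = swf.poll_for_activity_task(
        domain="example-domain",            # placeholder domain
        taskList={"name": "order-tasks"},   # placeholder task list
    )
    token = task.get("taskToken")
    if not token:
        continue  # poll timed out with no task; poll again

    result = process_order(task.get("input", ""))

    # Report the result back so the workflow can continue.
    swf.respond_activity_task_completed(taskToken=token, result=result)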

Question 22:

You are an Amazon Web Services solutions architect, and in your organization multiple processes run asynchronously; however, they have some dependencies on each other, which requires that you coordinate the execution of multiple distributed components and deal with the increased latencies and unreliability inherent in remote communication. Which of the following solutions best fits this scenario?

1. You will implement this with the help of message queues and databases, along with the logic to synchronize them.

2. You will use Amazon Simple Workflow (SWF)

3. You will implement this using Amazon Simple Queue Service (SQS)

4. You will solve this problem using Amazon Simple Notification Service (Amazon SNS)

Ans: 2

Exp: Amazon Simple Workflow (Amazon SWF) is a task coordination and state management service for cloud applications. With Amazon SWF, you can stop writing complex glue-code and state machinery and invest more in the business logic that makes your applications unique. The Amazon Simple Workflow Service (Amazon SWF) makes it easier to develop asynchronous and distributed applications by providing a programming model and infrastructure for coordinating distributed components and maintaining their execution state in a reliable way. By relying on Amazon SWF, you are freed to focus on building the aspects of your application that differentiate it.
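To make the coordination concrete, a hedged sketch of starting an SWF workflow execution with boto3 (the domain, workflow type, task list, and input are placeholder assumptions; the domain and workflow type must already be registered):

import boto3

swf = boto3.client("swf")

# Placeholder names; SWF then schedules decision and activity tasks
# and reliably tracks execution state on the application's behalf.
swf.start_workflow_execution(
    domain="example-domain",
    workflowId="order-12345",                           # unique per execution
    workflowType={"name": "ProcessOrder", "version": "1.0"},
    taskList={"name": "order-decisions"},
    input='{"orderId": 12345}',
)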

©2019 by Raghavendra Kambhampati