AWS Certified Solutions Architect – Professional SAP-C01 – Question219

Identify an application that polls AWS Data Pipeline for tasks and then performs those tasks.

A. A task executor
B. A task deployer
C. A task runner
D. A task optimizer

Correct Answer: C

Explanation:

A task runner is an application that polls AWS Data Pipeline for tasks and then performs those tasks. You can either use Task Runner as provided by AWS Data Pipeline or create a custom task runner application. Task Runner is the default implementation provided by AWS Data Pipeline: once installed and configured, it polls AWS Data Pipeline for tasks associated with pipelines that you have activated. When a task is assigned to Task Runner, it performs that task and reports its status back to AWS Data Pipeline. If your workflow requires non-default behavior, you'll need to implement that functionality in a custom task runner.
Reference:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-ho…
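
For illustration, a minimal custom task runner might look like the sketch below, using the boto3 datapipeline client. The worker group name and the do_work() logic are hypothetical placeholders, not part of the question; a real runner would map each task to the pipeline activity it implements.

```python
import boto3

client = boto3.client("datapipeline", region_name="us-east-1")

def do_work(task_object):
    # Placeholder for the actual work this task represents.
    print("Running task:", task_object["taskId"])

while True:
    # poll_for_task long-polls AWS Data Pipeline for a task assigned
    # to this worker group, so a tight loop is acceptable here.
    response = client.poll_for_task(workerGroup="my-worker-group")
    task = response.get("taskObject")
    if not task:
        continue  # No task available; poll again.
    try:
        do_work(task)
        # Report the result back to AWS Data Pipeline.
        client.set_task_status(taskId=task["taskId"], taskStatus="FINISHED")
    except Exception:
        client.set_task_status(taskId=task["taskId"], taskStatus="FAILED")
```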

AWS Certified Solutions Architect – Professional SAP-C01 – Question218

AWS Direct Connect itself has NO specific resources for you to control access to. Therefore, there are no AWS Direct Connect Amazon Resource Names (ARNs) for you to use in an Identity and Access Management (IAM) policy.
With that in mind, how is it possible to write a policy to control access to AWS Direct Connect actions?

A. You can leave the resource name field blank.
B. You can choose the name of the AWS Direct Connection as the resource.
C. You can use an asterisk (*) as the resource.
D. You can create a name for the resource.

Correct Answer: C

Explanation:

AWS Direct Connect itself has no specific resources for you to control access to. Therefore, there are no AWS Direct Connect ARNs for you to use in an IAM policy. You use an asterisk (*) as the resource when writing a policy to control access to AWS Direct Connect actions.
Reference:
http://docs.aws.amazon.com/directconnect/latest/UserGuide/using_iam…
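
A sketch of such a policy is shown below, attached via boto3. The user name and policy name are illustrative placeholders; the key point is the wildcard Resource element.

```python
import json
import boto3

# Because AWS Direct Connect exposes no ARNs, the Resource element
# of the policy must be the wildcard "*".
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["directconnect:Describe*"],
            "Resource": "*",  # No Direct Connect ARNs exist, so "*" is required.
        }
    ],
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="example-user",           # Placeholder user.
    PolicyName="DirectConnectReadOnly",  # Placeholder policy name.
    PolicyDocument=json.dumps(policy),
)
```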

AWS Certified Solutions Architect – Professional SAP-C01 – Question217

Which of the following cannot be done using AWS Data Pipeline?

A. Create complex data processing workloads that are fault tolerant, repeatable, and highly available.
B. Regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to another AWS service.
C. Generate reports over data that has been stored.
D. Move data between different AWS compute and storage services, as well as on-premises data sources, at specified intervals.

Correct Answer: C

Explanation:

AWS Data Pipeline is a web service that helps you reliably process and move data between different AWS compute and storage services, as well as on-premises data sources, at specified intervals. With AWS Data Pipeline, you can regularly access your data where it's stored, transform and process it at scale, and efficiently transfer the results to another AWS service. AWS Data Pipeline helps you easily create complex data processing workloads that are fault tolerant, repeatable, and highly available. AWS Data Pipeline also allows you to move and process data that was previously locked up in on-premises data silos.
Reference:
http://aws.amazon.com/datapipeline/
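
As a minimal sketch of the "specified intervals" behavior, the boto3 calls below create, define, and activate a pipeline with a daily schedule. The names, unique ID, start date, and definition fields are illustrative assumptions; a real definition would also include activities, which validation requires before activation succeeds.

```python
import boto3

client = boto3.client("datapipeline", region_name="us-east-1")

# Create an empty pipeline shell.
pipeline_id = client.create_pipeline(
    name="example-pipeline", uniqueId="example-pipeline-0001"
)["pipelineId"]

# Define a schedule object and wire it into the Default object.
client.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        {
            "id": "DefaultSchedule",
            "name": "Every day",
            "fields": [
                {"key": "type", "stringValue": "Schedule"},
                {"key": "period", "stringValue": "1 day"},
                {"key": "startDateTime", "stringValue": "2024-01-01T00:00:00"},
            ],
        },
        {
            "id": "Default",
            "name": "Default",
            "fields": [
                {"key": "scheduleType", "stringValue": "cron"},
                {"key": "schedule", "refValue": "DefaultSchedule"},
            ],
        },
    ],
)

client.activate_pipeline(pipelineId=pipeline_id)
```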

AWS Certified Solutions Architect – Professional SAP-C01 – Question216

In Amazon RDS for PostgreSQL, you can provision up to 3 TB of storage and 30,000 IOPS per database instance. For a workload with 50% writes and 50% reads running on a cr1.8xlarge instance, you can realize over 25,000 IOPS for PostgreSQL. However, by provisioning more than this limit, you may be able to achieve:

A. higher latency and lower throughput.
B. lower latency and higher throughput.
C. higher throughput only.
D. higher latency only.

Correct Answer: B

Explanation:

You can provision up to 3 TB of storage and 30,000 IOPS per database instance. For a workload with 50% writes and 50% reads running on a cr1.8xlarge instance, you can realize over 25,000 IOPS for PostgreSQL. However, by provisioning more than this limit, you may be able to achieve lower latency and higher throughput. Your actual realized IOPS may vary from the amount you provisioned based on your database workload, instance type, and database engine choice.
Reference:
https://aws.amazon.com/rds/postgresql/
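
A sketch of provisioning near these limits is shown below. The identifier and credentials are placeholders, and the db.cr1.8xlarge class reflects the era of this question rather than current instance offerings and limits.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")
rds.create_db_instance(
    DBInstanceIdentifier="example-postgres",  # Placeholder identifier.
    Engine="postgres",
    DBInstanceClass="db.cr1.8xlarge",  # Instance class referenced by the question.
    StorageType="io1",                 # Provisioned IOPS storage.
    AllocatedStorage=3072,             # ~3 TB, expressed in GiB.
    Iops=30000,                        # The per-instance limit cited above.
    MasterUsername="postgres",
    MasterUserPassword="replace-with-a-secret",  # Placeholder credential.
)
```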

AWS Certified Solutions Architect – Professional SAP-C01 – Question215

Select the correct statement about Amazon ElastiCache.

A. It makes it easy to set up, manage, and scale a distributed in-memory cache environment in the cloud.
B. It allows you to quickly deploy your cache environment only if you install software.
C. It does not integrate with other Amazon Web Services.
D. It cannot run in the Amazon Virtual Private Cloud (Amazon VPC) environment.

Correct Answer: A

Explanation:

ElastiCache is a web service that makes it easy to set up, manage, and scale a distributed in-memory cache environment in the cloud. It provides a high-performance, scalable, and cost-effective caching solution, while removing the complexity associated with deploying and managing a distributed cache environment. With ElastiCache, you can quickly deploy your cache environment, without having to provision hardware or install software.
Reference:
http://docs.aws.amazon.com/AmazonElastiCache/latest/UserGuide/WhatI…
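
The "no hardware, no software install" point can be illustrated with a single boto3 call; the cluster ID and node type below are placeholder assumptions.

```python
import boto3

# One API call launches a fully managed cache cluster; there is no
# hardware to provision and no cache software to install.
elasticache = boto3.client("elasticache", region_name="us-east-1")
elasticache.create_cache_cluster(
    CacheClusterId="example-cache",    # Placeholder cluster ID.
    Engine="redis",
    CacheNodeType="cache.t3.micro",    # Placeholder node type.
    NumCacheNodes=1,
)
```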

AWS Certified Solutions Architect – Professional SAP-C01 – Question214

Attempts, one of the three types of items associated with a scheduled pipeline in AWS Data Pipeline, provide robust data management.
Which of the following statements is NOT true about Attempts?

A. Attempts provide robust data management.
B. AWS Data Pipeline retries a failed operation until the count of retries reaches the maximum number of allowed retry attempts.
C. An AWS Data Pipeline Attempt object compiles the pipeline components to create a set of actionable instances.
D. AWS Data Pipeline Attempt objects track the various attempts, results, and failure reasons if applicable.

Correct Answer: C

Explanation:

Attempts, one of the three types of items associated with a scheduled pipeline in AWS Data Pipeline, provide robust data management. AWS Data Pipeline retries a failed operation, and continues to do so until the task reaches the maximum number of allowed retry attempts. Attempt objects track the various attempts, results, and failure reasons if applicable; essentially, an Attempt is an instance with a retry counter. AWS Data Pipeline performs retries using the same resources from the previous attempts, such as Amazon EMR clusters and EC2 instances.
Reference:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-ho…
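
A sketch of inspecting Attempt objects via boto3 is below. The pipeline ID is a placeholder; query_objects with sphere="ATTEMPT" returns the IDs of attempt objects, whose fields record the results and failure reasons for each retry.

```python
import boto3

client = boto3.client("datapipeline", region_name="us-east-1")
pipeline_id = "df-EXAMPLE1234567"  # Placeholder pipeline ID.

# List the attempt objects for this pipeline.
ids = client.query_objects(pipelineId=pipeline_id, sphere="ATTEMPT")["ids"]

if ids:
    # Fetch each attempt's fields (status, failure reason, etc.).
    objects = client.describe_objects(
        pipelineId=pipeline_id, objectIds=ids
    )["pipelineObjects"]
    for obj in objects:
        print(obj["name"], obj["fields"])
```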

AWS Certified Solutions Architect – Professional SAP-C01 – Question213

When using string conditions within IAM, short versions of the available comparators can be used instead of the more verbose ones. streqi is the short version of the _______ string condition.

A. StringEqualsIgnoreCase
B. StringNotEqualsIgnoreCase
C. StringLike
D. StringNotEquals

Correct Answer: A

Explanation:

When using string conditions within IAM, short versions of the available comparators can be used instead of the more verbose versions. For instance, streqi is the short version of StringEqualsIgnoreCase, which checks for an exact match between two strings while ignoring case.
Reference:
http://awsdocs.s3.amazonaws.com/SNS/20100331/sns-gsg-2010-03-31.pdf
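
As a small illustration, here is a policy statement using the long-form comparator that streqi abbreviates. The sns:Publish action and aws:UserAgent value are illustrative examples in the spirit of the referenced SNS guide, not part of the question.

```python
# A statement that allows publishing only when the request's user agent
# matches the given string, compared case-insensitively.
statement = {
    "Effect": "Allow",
    "Action": "sns:Publish",
    "Resource": "*",
    "Condition": {
        "StringEqualsIgnoreCase": {"aws:UserAgent": "Example Corp Java Client"}
    },
}
```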

AWS Certified Solutions Architect – Professional SAP-C01 – Question212

Which of the following is true when using an IAM role to grant permissions to applications running on Amazon EC2 instances?

A. All applications on the instance share the same role, but different permissions.
B. All applications on the instance share multiple roles and permissions.
C. Multiple roles are assigned to an EC2 instance at a time.
D. Only one role can be assigned to an EC2 instance at a time.

Correct Answer: D

Explanation:

Only one role can be assigned to an EC2 instance at a time, and all applications on the instance share the same role and permissions.
Reference:
http://docs.aws.amazon.com/IAM/latest/UserGuide/role-usecase-ec2app…
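
A sketch of attaching a role (via its instance profile) to a running instance is below; the profile name and instance ID are placeholders. Because an instance can have only one role at a time, a second association on the same instance fails until the first is replaced or removed.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Associate the role's instance profile with a running instance.
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "example-app-profile"},  # Placeholder name.
    InstanceId="i-0123456789abcdef0",                    # Placeholder ID.
)
```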

AWS Certified Solutions Architect – Professional SAP-C01 – Question211

In the context of policies and permissions in AWS IAM, the Condition element is ____________.

A. crucial while writing the IAM policies
B. an optional element
C. always set to null
D. a mandatory element

Correct Answer: B

Explanation:

The Condition element (or Condition block) lets you specify conditions for when a policy is in effect. The Condition element is optional.
Reference:
http://docs.aws.amazon.com/IAM/latest/UserGuide/AccessPolicyLanguag…
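
To make the optionality concrete, here are two example statements: the first omits Condition entirely and is still valid, while the second adds one so the statement applies only to requests from an example IP range. The bucket name and CIDR are illustrative placeholders.

```python
# Valid statement with no Condition element at all.
unconditional = {
    "Effect": "Allow",
    "Action": "s3:ListBucket",
    "Resource": "arn:aws:s3:::example-bucket",
}

# The same statement, narrowed by an optional Condition element.
conditional = {
    "Effect": "Allow",
    "Action": "s3:ListBucket",
    "Resource": "arn:aws:s3:::example-bucket",
    "Condition": {"IpAddress": {"aws:SourceIp": "203.0.113.0/24"}},
}
```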

AWS Certified Solutions Architect – Professional SAP-C01 – Question210

Which of the following is true of an instance profile when an IAM role is created using the console?

A. The instance profile uses a different name.
B. The console gives the instance profile the same name as the role it corresponds to.
C. The instance profile should be created manually by a user.
D. The console creates the role and instance profile as separate actions.

Correct Answer: B

Explanation:

Amazon EC2 uses an instance profile as a container for an IAM role. When you create an IAM role using the console, the console creates an instance profile automatically and gives it the same name as the role it corresponds to. If you use the AWS CLI, API, or an AWS SDK to create a role, you create the role and instance profile as separate actions, and you might give them different names.
Reference:
http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch…
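
The "separate actions" path can be sketched with boto3 as below, mirroring what the console does automatically. All names are illustrative placeholders; reusing the role name for the profile reproduces the console's convention.

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy allowing EC2 to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

# With the API, role and instance profile are created separately...
iam.create_role(
    RoleName="example-app-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.create_instance_profile(InstanceProfileName="example-app-role")

# ...and linked explicitly, a step the console performs for you.
iam.add_role_to_instance_profile(
    InstanceProfileName="example-app-role", RoleName="example-app-role"
)
```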