AWS Certified Data Analytics – Specialty DAS-C01 – Question 137

An educational technology company runs an online assessment application that allows thousands of students to take assessments concurrently on the company's platform. The application uses a combination of relational databases running on an Amazon Aurora PostgreSQL DB cluster and Amazon DynamoDB tables for storing data. Users reported application performance issues during a recent large-scale online assessment. As a result, the company wants to design a solution that captures metrics from all databases in a centralized location and queries the metrics to identify performance issues.
How can this solution be designed with the LEAST operational overhead?

A. Configure AWS Database Migration Service (AWS DMS) to copy the database logs to an Amazon S3 bucket. Schedule an AWS Glue crawler to periodically populate an AWS Glue table. Query the AWS Glue table with Amazon Athena.
B. Configure an Amazon CloudWatch metric stream with an Amazon Kinesis Data Firehose delivery stream destination that stores the data in an Amazon S3 bucket. Schedule an AWS Glue crawler to periodically populate an AWS Glue table. Query the AWS Glue table with Amazon Athena.
C. Create an Apache Kafka cluster on Amazon EC2. Configure a Java Database Connectivity (JDBC) connector for Kafka Connect on each database to capture and stream the logs to a single Amazon CloudWatch log group. Query the CloudWatch log group with Amazon Athena.
D. Install a server on Amazon EC2 to capture logs from Amazon RDS and DynamoDB by using Java Database Connectivity (JDBC) connectors. Stream the logs to an Amazon Kinesis Data Firehose delivery stream that stores the data in an Amazon S3 bucket. Query the output logs in the S3 bucket by using Amazon Athena.

Correct Answer: B
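
Explanation: Both Aurora and DynamoDB already publish metrics to Amazon CloudWatch, so a CloudWatch metric stream delivers them to Kinesis Data Firehose and on to S3 using fully managed services with no servers, connectors, or migration tasks to maintain, which is the least operational overhead; a scheduled Glue crawler and Athena then provide centralized querying.

For reference, a minimal boto3 sketch of the option B pipeline. All resource names, ARNs, the crawler schedule, and the Athena table name are hypothetical placeholders, and the S3 bucket and IAM roles are assumed to already exist.

```python
"""Sketch: CloudWatch metric stream -> Kinesis Data Firehose -> S3 -> Glue crawler -> Athena."""
import boto3

REGION = "us-east-1"  # assumed region
BUCKET_ARN = "arn:aws:s3:::db-metrics-bucket"  # assumed, pre-existing bucket
FIREHOSE_ROLE_ARN = "arn:aws:iam::123456789012:role/firehose-s3-role"  # assumed
METRIC_STREAM_ROLE_ARN = "arn:aws:iam::123456789012:role/metric-stream-role"  # assumed
GLUE_ROLE_ARN = "arn:aws:iam::123456789012:role/glue-crawler-role"  # assumed

firehose = boto3.client("firehose", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)
glue = boto3.client("glue", region_name=REGION)
athena = boto3.client("athena", region_name=REGION)

# 1. Kinesis Data Firehose delivery stream that lands records in the S3 bucket.
firehose.create_delivery_stream(
    DeliveryStreamName="db-metrics-stream",
    DeliveryStreamType="DirectPut",
    S3DestinationConfiguration={
        "RoleARN": FIREHOSE_ROLE_ARN,
        "BucketARN": BUCKET_ARN,
        "Prefix": "metrics/",
    },
)

# 2. CloudWatch metric stream filtered to the Aurora/RDS and DynamoDB namespaces.
cloudwatch.put_metric_stream(
    Name="db-metric-stream",
    FirehoseArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/db-metrics-stream",
    RoleArn=METRIC_STREAM_ROLE_ARN,
    OutputFormat="json",
    IncludeFilters=[{"Namespace": "AWS/RDS"}, {"Namespace": "AWS/DynamoDB"}],
)

# 3. Scheduled Glue crawler that keeps the Glue table current as new objects arrive.
glue.create_crawler(
    Name="db-metrics-crawler",
    Role=GLUE_ROLE_ARN,
    DatabaseName="db_metrics",
    Targets={"S3Targets": [{"Path": "s3://db-metrics-bucket/metrics/"}]},
    Schedule="cron(0/15 * * * ? *)",  # assumed 15-minute cadence
)

# 4. Ad hoc Athena query against the crawled table (table name assumed to be "metrics").
athena.start_query_execution(
    QueryString=(
        "SELECT namespace, metric_name, MAX(value.max) AS peak_value "
        "FROM metrics GROUP BY namespace, metric_name ORDER BY peak_value DESC"
    ),
    QueryExecutionContext={"Database": "db_metrics"},
    ResultConfiguration={"OutputLocation": "s3://db-metrics-bucket/athena-results/"},
)
```

In this sketch, every step is a managed-service API call rather than infrastructure to operate, which is what distinguishes option B from the EC2- and Kafka-based alternatives in options C and D.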