A company owns manufacturing facilities with Internet of Things (IoT) devices installed to monitor safety data.
The company has configured an Amazon Kinesis data stream as a source for an Amazon Kinesis Data Firehose delivery stream, which outputs data to Amazon S3. The company's operations team wants to gain insights from the IoT data to monitor data quality at ingestion. The insights need to be derived in near-real time, and the output must be logged to Amazon DynamoDB for further analysis.
Which solution meets these requirements?
A. Create an Amazon Kinesis Data Analytics for SQL application to read and analyze the data in the data stream. Add an output configuration so that everything written to an in-application stream persists in a DynamoDB table.
B. Create an Amazon Kinesis Data Analytics for SQL application to read and analyze the data in the data stream. Add an output configuration so that everything written to an in-application stream is passed to an AWS Lambda function that saves the data in a DynamoDB table as persistent data.
C. Configure an AWS Lambda function to analyze the data in the Kinesis Data Firehose delivery stream. Save the output to a DynamoDB table.
D. Configure an AWS Lambda function to analyze the data in the Kinesis Data Firehose delivery stream and save the output to an S3 bucket. Schedule an AWS Glue job to periodically copy the data from the bucket to a DynamoDB table.
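As a rough illustration of the pattern described in option B, the sketch below shows a Lambda function that receives output records from a Kinesis Data Analytics for SQL application (Lambda as an output destination) and writes them to DynamoDB. It is a minimal sketch, assuming the output records arrive base64-encoded as JSON in the event's records array, that each payload already contains the table's key attributes, and that the table name (IotDataQualityMetrics) is hypothetical and supplied via an environment variable.

```python
import base64
import json
import os
from decimal import Decimal

import boto3

# Hypothetical table name; in a real deployment this would be set as a
# Lambda environment variable pointing at the operations team's table.
TABLE_NAME = os.environ.get("QUALITY_METRICS_TABLE", "IotDataQualityMetrics")

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(TABLE_NAME)


def lambda_handler(event, context):
    """Persist Kinesis Data Analytics output records to DynamoDB."""
    results = []
    for record in event.get("records", []):
        try:
            # Each record's data field is base64-encoded JSON emitted by
            # the in-application output stream. parse_float=Decimal keeps
            # numeric values compatible with DynamoDB's type requirements.
            payload = json.loads(
                base64.b64decode(record["data"]), parse_float=Decimal
            )
            table.put_item(Item=payload)
            results.append({"recordId": record["recordId"], "result": "Ok"})
        except Exception:
            # Report the record as failed so the service can handle retry.
            results.append(
                {"recordId": record["recordId"], "result": "DeliveryFailed"}
            )
    # The response must echo each recordId with a delivery result.
    return {"records": results}
```

This keeps the analysis in the Kinesis Data Analytics application and uses the Lambda function only as the delivery step into DynamoDB, which is why the Lambda-as-output approach is the one to compare against the other options.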