
AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers

Question 81

A company hosts more than 300 global websites and applications. The company requires a platform to analyze more than 30 TB of clickstream data each day.

What should a solutions architect do to transmit and process the clickstream data?

Options:

A.

Design an AWS Data Pipeline to archive the data to an Amazon S3 bucket and run an Amazon EMR cluster with the data to generate analytics.

B.

Create an Auto Scaling group of Amazon EC2 instances to process the data and send it to an Amazon S3 data lake for Amazon Redshift to use for analysis.

C.

Cache the data to Amazon CloudFront. Store the data in an Amazon S3 bucket. When an object is added to the S3 bucket, run an AWS Lambda function to process the data for analysis.

D.

Collect the data from Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to transmit the data to an Amazon S3 data lake. Load the data into Amazon Redshift for analysis.
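
For reference, below is a minimal boto3 sketch of the ingestion step option D describes: writing one clickstream event into an Amazon Kinesis data stream. The stream name, region, and event fields are hypothetical.

```python
# Minimal sketch of option D's ingestion step: one clickstream event
# written to a Kinesis data stream. Names and fields are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # region is an assumption

event = {"site_id": "site-042", "page": "/checkout", "ts": "2024-01-01T00:00:00Z"}

kinesis.put_record(
    StreamName="clickstream-events",             # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),      # one record, JSON-encoded
    PartitionKey=event["site_id"],               # spreads records across shards
)
```

From there, a Kinesis Data Firehose delivery stream can read from the data stream and deliver batches to the S3 data lake for loading into Redshift.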

Question 82

A startup company is using the AWS Cloud to develop a traffic control monitoring system for a large city. The system must be highly available and must provide near-real-time results for residents and city officials even during peak events.

Gigabytes of data will come in daily from IoT devices that run at intersections and freeway ramps across the city. The system must process the data sequentially to provide the correct timeline. However, results need to show only what has happened in the last 24 hours.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy Amazon Kinesis Data Firehose to accept incoming data from the IoT devices and write the data to Amazon S3. Build a web dashboard to display the data from the last 24 hours.

B.

Deploy an Amazon API Gateway API endpoint and an AWS Lambda function to process incoming data from the IoT devices and store the data in Amazon DynamoDB. Build a web dashboard to display the data from the last 24 hours.

C.

Deploy an Amazon API Gateway API endpoint and an Amazon Simple Notification Service (Amazon SNS) topic to process incoming data from the IoT devices. Write the data to Amazon Redshift. Build a web dashboard to display the data from the last 24 hours.

D.

Deploy an Amazon Simple Queue Service (Amazon SQS) FIFO queue and an AWS Lambda function to process incoming data from the IoT devices and store the data in an Amazon RDS DB instance. Build a web dashboard to display the data from the last 24 hours.
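
For reference, below is a minimal sketch of the write path option B describes: a Lambda handler behind API Gateway that stores each IoT reading in DynamoDB with a time-to-live attribute, so items expire after 24 hours. The table name and attribute names are hypothetical, and the sketch assumes TTL is enabled on the expires_at attribute.

```python
# Minimal sketch of option B's write path: API Gateway -> Lambda -> DynamoDB.
# Table and attribute names are hypothetical; TTL is assumed enabled on "expires_at".
import json
import time
import boto3

table = boto3.resource("dynamodb").Table("traffic-readings")  # hypothetical table

def handler(event, context):
    reading = json.loads(event["body"])  # API Gateway proxy integration payload
    table.put_item(
        Item={
            "sensor_id": reading["sensor_id"],
            "ts": reading["ts"],
            "vehicle_count": reading["vehicle_count"],
            # DynamoDB TTL deletes the item after this epoch timestamp, so the
            # dashboard only ever sees roughly the last 24 hours of readings.
            "expires_at": int(time.time()) + 24 * 3600,
        }
    )
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```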

Question 83

A disaster relief company is designing a new solution to analyze real-time CSV data. The data is collected by a network of thousands of research stations that are distributed across the world. The data volume is consistent and constant, and the size of each data file is 512 KB. The company needs to stream the data and analyze the data in real time.

Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Provision an appropriately sized Amazon Simple Queue Service (Amazon SQS) queue. Use the AWS SDK at the research stations to write the data into the SQS queue.

B.

Provision an appropriately sized Amazon Kinesis Data Firehose delivery stream. Use the AWS SDK at the research stations to write the data into the delivery stream and then into an Amazon S3 bucket.

C.

Provision an appropriately sized Amazon Kinesis Data Analytics application. Use the AWS CLI to configure Kinesis Data Analytics with SQL queries.

D.

Provision an AWS Lambda function to process the data. Set up the BatchSize property on the Lambda event source.

E.

Provision an AWS Lambda function to process the data. Set up an Amazon EventBridge (Amazon CloudWatch Events) cron expression rule to invoke the Lambda function.
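
For reference, below is a minimal boto3 sketch of the ingestion side of option B: a research station writing one CSV record into a Kinesis Data Firehose delivery stream that delivers to Amazon S3. The delivery stream name, region, and record contents are hypothetical.

```python
# Minimal sketch of option B's ingestion step: one CSV record sent to a
# Firehose delivery stream. Names and contents are hypothetical.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")  # region is an assumption

# A 512 KB file fits under Firehose's 1,000 KiB per-record size limit,
# so each file can be sent as a single record.
csv_row = "station-117,2024-01-01T00:00:00Z,13.2,48.1\n"  # hypothetical CSV contents

firehose.put_record(
    DeliveryStreamName="research-telemetry",       # hypothetical delivery stream
    Record={"Data": csv_row.encode("utf-8")},      # Firehose buffers and writes to S3
)
```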

Question 84

A solutions architect is creating an application that will handle batch processing of large amounts of data. The input data will be held in Amazon S3, and the output data will be stored in a different S3 bucket. For processing, the application will transfer the data over the network between multiple Amazon EC2 instances.

What should the solutions architect do to reduce the overall data transfer costs?

Options:

A.

Place all the EC2 instances in an Auto Scaling group.

B.

Place all the EC2 instances in the same AWS Region.

C.

Place all the EC2 instances in the same Availability Zone.

D.

Place all the EC2 instances in private subnets in multiple Availability Zones.
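
For reference, below is a minimal boto3 sketch of the placement idea behind option C: launching the processing fleet into a single subnet, which pins every instance to one Availability Zone. Traffic between instances in the same Availability Zone over private IP addresses is not billed, unlike cross-AZ traffic. The AMI ID, subnet ID, and instance type are hypothetical.

```python
# Minimal sketch of option C's placement: all instances in one subnet,
# and therefore one Availability Zone. All identifiers are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",          # hypothetical AMI
    InstanceType="c5.large",                  # hypothetical instance type
    MinCount=4,
    MaxCount=4,
    # A subnet resides in exactly one Availability Zone, so launching every
    # instance into the same subnet keeps all inter-instance traffic intra-AZ.
    SubnetId="subnet-0abc1234def567890",      # hypothetical subnet ID
)
```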
