Summer Special - Limited Time 65% Discount Offer - Ends in 0d 00h 00m 00s - Coupon code: top65certs

AWS Certified AI Practitioner AIF-C01 Updated Exam

Page: 6 / 13
Total 177 questions

AWS Certified AI Practitioner Exam Questions and Answers

Question 21

A company's large language model (LLM) is experiencing hallucinations.

How can the company decrease hallucinations?

Options:

A.

Set up Agents for Amazon Bedrock to supervise the model training.

B.

Use data pre-processing and remove any data that causes hallucinations.

C.

Decrease the temperature inference parameter for the model.

D.

Use a foundation model (FM) that is trained to not hallucinate.

Question 22

A company needs to log all requests made to its Amazon Bedrock API. The company must retain the logs securely for 5 years at the lowest possible cost.

Which combination of AWS service and storage class meets these requirements? (Select TWO.)

Options:

A.

AWS CloudTrail

B.

Amazon CloudWatch

C.

AWS Audit Manager

D.

Amazon S3 Intelligent-Tiering

E.

Amazon S3 Standard

Question 23

A bank has fine-tuned a large language model (LLM) to expedite the loan approval process. During an external audit of the model, the company discovered that the model was approving loans at a faster pace for a specific demographic than for other demographics.

How should the bank fix this issue MOST cost-effectively?

Options:

A.

Include more diverse training data. Fine-tune the model again by using the new data.

B.

Use Retrieval Augmented Generation (RAG) with the fine-tuned model.

C.

Use AWS Trusted Advisor checks to eliminate bias.

D.

Pre-train a new LLM with more diverse training data.

Question 24

A company wants to use a large language model (LLM) to develop a conversational agent. The company needs to prevent the LLM from being manipulated with common prompt engineering techniques to perform undesirable actions or expose sensitive information.

Which action will reduce these risks?

Options:

A.

Create a prompt template that teaches the LLM to detect attack patterns.

B.

Increase the temperature parameter on invocation requests to the LLM.

C.

Avoid using LLMs that are not listed in Amazon SageMaker.

D.

Decrease the number of input tokens on invocations of the LLM.

Page: 6 / 13
Total 177 questions