
Free and Premium Oracle 1z0-1127-24 Dumps Questions Answers

Page: 1 / 3
Total 40 questions

Oracle Cloud Infrastructure 2024 Generative AI Professional Questions and Answers

Question 1

What does "k-shot prompting* refer to when using Large Language Models for task-specific applications?

Options:

A.

Limiting the model to only k possible outcomes or answers for a given task

B.

The process of training the model on k different tasks simultaneously to improve its versatility

C.

Explicitly providing k examples of the intended task in the prompt to guide the model's output

D.

Providing the exact k words in the prompt to guide the model’s response
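
To make the idea in Question 1 concrete, here is a minimal sketch of k-shot (few-shot) prompting in plain Python: the prompt itself carries k worked examples of the task ahead of the new input. The review texts and labels are invented for illustration.

```python
# Minimal k-shot prompting sketch: k = 3 labeled examples are placed in the
# prompt ahead of the new, unlabeled input to guide the model's output.
examples = [
    ("The service was quick and friendly.", "positive"),
    ("My order arrived broken.", "negative"),
    ("It works as described, nothing special.", "neutral"),
]

def build_k_shot_prompt(examples, new_input):
    """Concatenate k labeled examples, then append the unlabeled query."""
    shots = "\n".join(f"Review: {text}\nSentiment: {label}" for text, label in examples)
    return f"{shots}\nReview: {new_input}\nSentiment:"

prompt = build_k_shot_prompt(examples, "Great battery life but a dim screen.")
print(prompt)  # this single string would be sent as-is to the LLM
```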

Question 2

What distinguishes the Cohere Embed v3 model from its predecessor in the OCI Generative AI service?

Options:

A.

Improved retrievals for Retrieval Augmented Generation (RAG) systems

B.

Capacity to translate text in over 100 languages

C.

Support for tokenizing longer sentences

D.

Emphasis on syntactic clustering of word embeddings

Question 3

Which is NOT a category of pretrained foundational models available in the OCI Generative AI service?

Options:

A.

Translation models

B.

Summarization models

C.

Generation models

D.

Embedding models

Question 4

Why is normalization of vectors important before indexing in a hybrid search system?

Options:

A.

It converts all sparse vectors to dense vectors.

B.

It significantly reduces the size of the database.

C.

It standardizes vector lengths for meaningful comparison using metrics such as Cosine Similarity.

D.

It ensures that all vectors represent keywords only.
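
Question 4 can be made concrete with a small NumPy sketch (the vector values are arbitrary): once vectors are normalized to unit length, cosine similarity reduces to a plain dot product, so vectors of very different magnitudes become directly comparable.

```python
import numpy as np

def normalize(v):
    """Scale a vector to unit length (L2 norm = 1)."""
    return v / np.linalg.norm(v)

a = np.array([3.0, 4.0, 0.0])   # length 5
b = np.array([0.3, 0.4, 0.0])   # same direction, length 0.5

# Cosine similarity depends only on direction, so both computations give 1.0,
# but after normalization it is just a dot product between unit vectors.
cos_raw = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
cos_unit = float(np.dot(normalize(a), normalize(b)))
print(cos_raw, cos_unit)  # 1.0 1.0 (up to floating-point rounding)
```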

Question 5

Which statement best describes the role of encoder and decoder models in natural language processing?

Options:

A.

Encoder models and decoder models both convert sequences of words into vector representations without generating new text.

B.

Encoder models are used only for numerical calculations, whereas decoder models are used to interpret the calculated numerical values back into text.

C.

Encoder models take a sequence of words and predict the next word in the sequence, whereas decoder models convert a sequence of words into a numerical representation.

D.

Encoder models convert a sequence of words into a vector representation, and decoder models take this vector representation to generate a sequence of words.
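
As a rough illustration of the encoder/decoder split asked about in Question 5, the sketch below uses Hugging Face transformers (the model names are chosen only as familiar examples): an encoder such as BERT maps a word sequence to vector representations, while a decoder such as GPT-2 takes a sequence and generates new text from it.

```python
# Sketch only: contrast an encoder (text -> vectors) with a decoder (text -> more text).
from transformers import AutoTokenizer, AutoModel, AutoModelForCausalLM

text = "Encoders and decoders play different roles."

# Encoder: BERT turns the token sequence into contextual vector representations.
enc_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
vectors = encoder(**enc_tok(text, return_tensors="pt")).last_hidden_state  # shape [1, seq_len, 768]

# Decoder: GPT-2 continues the sequence, predicting new tokens from what came before.
dec_tok = AutoTokenizer.from_pretrained("gpt2")
decoder = AutoModelForCausalLM.from_pretrained("gpt2")
out = decoder.generate(**dec_tok(text, return_tensors="pt"), max_new_tokens=20)
print(dec_tok.decode(out[0]))
```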

Question 6

In LangChain, which retriever search type is used to balance between relevancy and diversity?

Options:

A.

top k

B.

mmr

C.

similarity_score_threshold

D.

similarity
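
For Question 6, the relevant knob is LangChain's `search_type` argument on `as_retriever`. A minimal sketch, assuming a FAISS store built with random `FakeEmbeddings` so no real embedding service is needed (the retrieved documents are therefore not meaningful; only the API usage matters here):

```python
# Sketch of selecting the MMR search type, which re-ranks candidates with
# Maximal Marginal Relevance to balance relevancy against diversity.
# Requires the faiss-cpu and langchain-community packages.
from langchain_community.embeddings import FakeEmbeddings
from langchain_community.vectorstores import FAISS

docs = ["OCI offers GPU shapes.", "OCI offers GPU instances.", "Cohere models embed text."]
vectorstore = FAISS.from_texts(docs, FakeEmbeddings(size=256))

retriever = vectorstore.as_retriever(search_type="mmr", search_kwargs={"k": 2, "fetch_k": 3})
print(retriever.invoke("What does OCI offer?"))
```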

Question 7

Which statement is true about the "Top p" parameter of the OCI Generative AI Generation models?

Options:

A.

Top p assigns penalties to frequently occurring tokens.

B.

Top p determines the maximum number of tokens per response.

C.

Top p limits token selection based on the sum of their probabilities.

D.

Top p selects tokens from the "Top k" tokens sorted by probability.
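
The mechanism behind Question 7 is easiest to see in a small NumPy sketch of a generic top-p (nucleus) cutoff, using a made-up next-token distribution: tokens are kept in probability order until their cumulative probability reaches p, and sampling happens only among the kept tokens.

```python
import numpy as np

def top_p_filter(probs, p=0.75):
    """Keep the smallest set of tokens whose cumulative probability reaches p, then renormalize."""
    order = np.argsort(probs)[::-1]                  # most likely tokens first
    cumulative = np.cumsum(probs[order])
    keep = order[: np.searchsorted(cumulative, p) + 1]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()

vocab = np.array(["cat", "dog", "car", "sky"])
probs = np.array([0.5, 0.3, 0.15, 0.05])             # illustrative next-token probabilities
filtered = top_p_filter(probs, p=0.75)
print(dict(zip(vocab, filtered)))                    # only "cat" and "dog" survive at p = 0.75
next_token = np.random.choice(vocab, p=filtered)     # sample from the kept tokens only
```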

Question 8

Which is a cost-related benefit of using vector databases with Large Language Models (LLMs)?

Options:

A.

They require frequent manual updates, which increase operational costs.

B.

They offer real-time updated knowledge bases and are cheaper than fine-tuned LLMs.

C.

They increase the cost due to the need for real- time updates.

D.

They are more expensive but provide higher quality data.

Question 9

Given the following prompts used with a Large Language Model, classify each as employing the Chain-of-Thought, Least-to-most, or Step-Back prompting technique.

1. Calculate the total number of wheels needed for 3 cars. Cars have 4 wheels each. Then, use the total number of wheels to determine how many sets of wheels we can buy with $200 if one set (4 wheels) costs $50.

2. Solve a complex math problem by first identifying the formula needed, and then solve a simpler version of the problem before tackling the full question.

3. To understand the impact of greenhouse gases on climate change, let's start by defining what greenhouse gases are. Next, we'll explore how they trap heat in the Earth's atmosphere.

Options:

A.

1: Step-Back, 2: Chain-of-Thought, 3: Least-to-most

B.

1: Least-to-most, 2: Chain-of-Thought, 3: Step-Back

C.

1: Chain-of-Thought, 2: Step-Back, 3: Least-to-most

D.

1: Chain-of-Thought, 2: Least-to-most, 3: Step-Back

Question 10

What is the purpose of the "stop sequence" parameter in the OCI Generative AI Generation models?

Options:

A.

It controls the randomness of the model's output, affecting its creativity.

B.

It specifies a string that tells the model to stop generating more content.

C.

It assigns a penalty to frequently occurring tokens to reduce repetitive text.

D.

It determines the maximum number of tokens the model can generate per response.
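
Question 10 concerns the stop sequence parameter. Independent of any particular SDK, its effect can be sketched as truncating generated text at the first occurrence of a chosen string (the stop string below is an arbitrary example):

```python
def apply_stop_sequence(generated_text, stop="\nHuman:"):
    """Truncate model output at the first occurrence of the stop sequence, if present."""
    index = generated_text.find(stop)
    return generated_text if index == -1 else generated_text[:index]

raw = "Sure, here is the summary.\nHuman: ask me something else"
print(apply_stop_sequence(raw))  # -> "Sure, here is the summary."
```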

Question 11

How does the architecture of dedicated AI clusters contribute to minimizing GPU memory overhead for T-Few fine-tuned model inference?

Options:

A.

By sharing base model weights across multiple fine-tuned models on the same group of GPUs

B.

By optimizing GPU memory utilization for each model's unique parameters

C.

By allocating separate GPUs for each model instance

D.

By loading the entire model into GPU memory for efficient processing
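
The memory argument behind Question 11 can be sketched conceptually (the class names and sizes below are invented for illustration, not OCI internals): if several T-Few fine-tuned variants reference one shared set of base weights, only their small task-specific weights are duplicated, instead of a full copy of the base model per variant.

```python
# Conceptual sketch: many fine-tuned variants share one base-weight object,
# so only the small T-Few adapter weights are duplicated per model.
class BaseModelWeights:
    def __init__(self, size_gb):
        self.size_gb = size_gb            # large, shared portion loaded once per GPU group

class FineTunedVariant:
    def __init__(self, name, base, adapter_size_gb):
        self.name = name
        self.base = base                  # reference to the shared weights, not a copy
        self.adapter_size_gb = adapter_size_gb

base = BaseModelWeights(size_gb=40)
variants = [FineTunedVariant(f"variant-{i}", base, adapter_size_gb=0.2) for i in range(5)]

shared_total = base.size_gb + sum(v.adapter_size_gb for v in variants)   # 41 GB with sharing
naive_total = sum(base.size_gb + v.adapter_size_gb for v in variants)    # 201 GB with full copies
print(shared_total, naive_total)
```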

Question 12

Which Oracle Accelerated Data Science (ADS) class can be used to deploy a Large Language Model (LLM) application to OCI Data Science model deployment?

Options:

A.

RetrievalQA

B.

TextLoader

C.

ChainDeployment

D.

GenerativeAI
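
For Question 12, a rough, non-authoritative sketch of deploying a LangChain application with the Oracle ADS ChainDeployment class is shown below. The prepare/save/deploy call pattern follows the general ADS model-deployment workflow and should be checked against the current oracle-ads documentation; the conda environment URI and display names are placeholders.

```python
# Hedged sketch, not a verified recipe: exact ChainDeployment parameters may
# differ between oracle-ads releases.
import tempfile
from langchain_core.prompts import PromptTemplate
from ads.llm.deploy import ChainDeployment

# Stand-in LangChain runnable; a real application would chain a prompt into an LLM.
chain = PromptTemplate.from_template("Answer briefly: {question}")

deployment = ChainDeployment(chain, artifact_dir=tempfile.mkdtemp())
deployment.prepare(inference_conda_env="<conda-env-uri>")      # placeholder environment URI
deployment.save(display_name="langchain-app")                  # registers the model artifact
deployment.deploy(deployment_display_name="langchain-app")     # creates the model deployment
```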
