Oracle Cloud Infrastructure 2025 Generative AI Professional Questions and Answers
Which is a distinctive feature of GPUs in Dedicated AI Clusters used for generative AI tasks?
You create a fine-tuning dedicated AI cluster to customize a foundational model with your custom training data. How many unit hours are required for fine-tuning if the cluster is active for 10 hours?
An AI development company is working on an advanced AI assistant capable of handling queries seamlessly. Their goal is to create an assistant that can analyze images provided by users and generate descriptive text, as well as take text descriptions and produce accurate visual representations. Considering these capabilities, which type of model would the company most likely focus on integrating into their AI assistant?
Which statement is true about Fine-tuning and Parameter-Efficient Fine-Tuning (PEFT)?
How does the temperature setting in a decoding algorithm influence the probability distribution over the vocabulary?
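For reference, a minimal illustrative sketch in plain Python (logit values are made up) of how the temperature divides the logits before the softmax; lower values sharpen the distribution toward the most likely tokens, higher values flatten it toward uniform:

import math

def softmax_with_temperature(logits, temperature=1.0):
    # Divide each logit by the temperature, then apply a numerically stable softmax.
    scaled = [x / temperature for x in logits]
    m = max(scaled)
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
print(softmax_with_temperature(logits, 0.5))  # sharper: probability mass concentrates on the top token
print(softmax_with_temperature(logits, 2.0))  # flatter: probabilities move closer to uniform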
Given the following code:
chain = prompt | llm
Which statement is true about LangChain Expression Language (LCEL)?
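For context, a minimal LCEL sketch (the template text is illustrative, and the model setup is an assumption left commented out) showing how the pipe operator composes a prompt and a model into a Runnable chain:

from langchain_core.prompts import PromptTemplate

template = "Summarize the following text in one sentence: {text}"
prompt = PromptTemplate(input_variables=["text"], template=template)

# llm = ...  # any LangChain-compatible LLM or chat model (for example, an OCI Generative AI integration)
# chain = prompt | llm          # LCEL: the | operator chains Runnables left to right
# response = chain.invoke({"text": "LangChain Expression Language composes components declaratively."})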
Which is a key characteristic of the annotation process used in T-Few fine-tuning?
What is the purpose of Retrievers in LangChain?
Accuracy in vector databases contributes to the effectiveness of Large Language Models (LLMs) by preserving a specific type of relationship. What is the nature of these relationships, and why are they crucial for language models?
Why is normalization of vectors important before indexing in a hybrid search system?
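As an illustrative aside (plain Python, not tied to any particular vector database), L2-normalizing vectors makes the dot product equal to cosine similarity, which keeps dense scores on a comparable scale when they are blended with keyword scores in hybrid search:

import math

def l2_normalize(v):
    # Scale the vector to unit length.
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

a, b = [3.0, 4.0], [1.0, 2.0]
na, nb = l2_normalize(a), l2_normalize(b)

dot_normalized = sum(x * y for x, y in zip(na, nb))
cosine = sum(x * y for x, y in zip(a, b)) / (
    math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
)
print(abs(dot_normalized - cosine) < 1e-9)  # True: dot product on unit vectors equals cosine similarity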
Which statement describes the difference between "Top k" and "Top p" in selecting the next token in the OCI Generative AI Generation models?
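For reference, a toy sketch (the token probabilities are made up, not from any real model) contrasting the two filters: "Top k" keeps a fixed number of the most likely tokens, while "Top p" keeps the smallest set of top tokens whose cumulative probability reaches p:

def top_k_filter(probs, k):
    # Keep only the k highest-probability tokens, then renormalize.
    kept = dict(sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k])
    total = sum(kept.values())
    return {t: p / total for t, p in kept.items()}

def top_p_filter(probs, p):
    # Keep the smallest set of top tokens whose cumulative probability >= p, then renormalize.
    kept, cumulative = {}, 0.0
    for token, prob in sorted(probs.items(), key=lambda kv: kv[1], reverse=True):
        kept[token] = prob
        cumulative += prob
        if cumulative >= p:
            break
    total = sum(kept.values())
    return {t: pr / total for t, pr in kept.items()}

probs = {"cat": 0.5, "dog": 0.3, "bird": 0.15, "fish": 0.05}
print(top_k_filter(probs, k=2))    # only "cat" and "dog" survive
print(top_p_filter(probs, p=0.9))  # "cat", "dog", "bird" survive (0.5 + 0.3 + 0.15 >= 0.9)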
Which is a key characteristic of Large Language Models (LLMs) without Retrieval Augmented Generation (RAG)?
Which is NOT a category of pretrained foundational models available in the OCI Generative AI service?
Given the following code:
PromptTemplate(input_variables=["human_input", "city"], template=template)
Which statement is true about PromptTemplate in relation to input_variables?
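For context, a minimal runnable sketch (the template wording is illustrative) showing that every name declared in input_variables must appear as a placeholder in the template and be supplied when the prompt is formatted:

from langchain_core.prompts import PromptTemplate

template = "You are a local guide for {city}. {human_input}"
prompt = PromptTemplate(input_variables=["human_input", "city"], template=template)

# All declared input variables must be provided at format/invoke time:
print(prompt.format(human_input="Recommend three museums.", city="Paris"))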
What is the purpose of embeddings in natural language processing?
How does the utilization of T-Few transformer layers contribute to the efficiency of the fine-tuning process?
What happens if a period (.) is used as a stop sequence in text generation?
What does "k-shot prompting" refer to when using Large Language Models for task-specific applications?
How does the integration of a vector database into Retrieval-Augmented Generation (RAG)-based Large Language Models (LLMs) fundamentally alter their responses?
Which is a distinguishing feature of "Parameter-Efficient Fine-Tuning (PEFT)" as opposed to classic "Fine-tuning" in Large Language Model training?
What is the purpose of the "stop sequence" parameter in the OCI Generative AI Generation models?
An AI development company is working on an AI-assisted chatbot for a customer, an online retail company. The goal is to create an assistant that can best answer queries about company policies and retain the chat history throughout a session. Considering these capabilities, which type of model would be the best choice?
What role does a "model endpoint" serve in the inference workflow of the OCI Generative AI service?
What does the term "hallucination" refer to in the context of Large Language Models (LLMs)?
Analyze the user prompts provided to a language model. Which scenario exemplifies prompt injection (jailbreaking)?