---
title: Integrating AI models into your development workflow
shortTitle: Integrate AI models
intro: Call AI models in the tools you use every day.
---
With {% data variables.product.prodname_github_models %} extensions, you can call specific AI models from both {% data variables.product.prodname_copilot_chat_short %} and {% data variables.product.prodname_cli %}. These extensions integrate directly into your development workflow, allowing you to prompt models without context switching.
## Using AI models in {% data variables.product.prodname_copilot_chat_short %}
If you have a {% data variables.product.prodname_copilot_short %} subscription, you can work with AI models in {% data variables.product.prodname_copilot_chat_short %} in two different ways:
- Using the {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %}. With this extension, you can ask for model recommendations based on certain criteria and chat with specific models. See Using the {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %}.
- Using multiple model support in {% data variables.product.prodname_copilot_chat_short %}. With multi-model {% data variables.product.prodname_copilot_chat_short %}, you can choose a specific model to use for a conversation, then prompt {% data variables.product.prodname_copilot_chat_short %} as usual. See AUTOTITLE and AUTOTITLE.
### Using the {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %}
> [!NOTE]
> The {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
1. Install the {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %}.
   - If you have a {% data variables.product.prodname_copilot_pro_short %} subscription, you can install the extension on your personal account.
   - If you have access to {% data variables.product.prodname_copilot_short %} through a {% data variables.product.prodname_copilot_business_short %} or {% data variables.product.prodname_copilot_enterprise_short %} subscription:
     - An organization owner or enterprise owner needs to enable the {% data variables.product.prodname_copilot_extensions_short %} policy for your organization or enterprise.
     - An organization owner needs to install the extension for your organization.
1. Open any implementation of {% data variables.product.prodname_copilot_chat_short %} that supports {% data variables.product.prodname_copilot_extensions %}. For a list of supported {% data variables.product.prodname_copilot_chat_short %} implementations, see AUTOTITLE.
1. In the chat window, type `@models YOUR-PROMPT`, then send your prompt.

There are several use cases for the {% data variables.product.prodname_github_models %} {% data variables.product.prodname_copilot_extension_short %}, including:

- Recommending a particular model based on context and criteria you provide. For example, you can ask for a low-cost OpenAI model that supports function calling.
- Executing prompts using a particular model. This is especially useful when you want to use a model that is not currently available in multi-model {% data variables.product.prodname_copilot_chat_short %}.
- Listing models currently available through {% data variables.product.prodname_github_models %}.
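For example, a recommendation request in the chat window might look like this (the exact wording of the prompt is up to you; this prompt is illustrative, not required syntax):

```text
@models Recommend a low-cost OpenAI model that supports function calling
```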
## Using AI models from the command line
> [!NOTE]
> The {% data variables.product.prodname_github_models %} extension for {% data variables.product.prodname_cli %} is in {% data variables.release-phases.public_preview %} and is subject to change.
You can use the {% data variables.product.prodname_github_models %} extension for {% data variables.product.prodname_cli %} to prompt AI models from the command line, and even pipe in the output of a command as context.
### Prerequisites
To use the {% data variables.product.prodname_github_models %} CLI extension, you need to have {% data variables.product.prodname_cli %} installed. {% data reusables.cli.cli-installation %}
### Installing the extension
1. If you have not already authenticated to the {% data variables.product.prodname_cli %}, run the following command in your terminal.

   ```shell
   gh auth login
   ```

1. To install the {% data variables.product.prodname_github_models %} extension, run the following command.

   ```shell
   gh extension install https://github.com/github/gh-models
   ```
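Optionally, to confirm the extension was installed, you can list your installed {% data variables.product.prodname_cli %} extensions with the built-in `gh extension list` command. The `github/gh-models` entry should appear in the output.

```shell
gh extension list
```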
### Using the extension
To see a list of all available commands, run `gh models`.
There are a few key ways you can use the extension:
- To ask a model multiple questions using a chat experience, run `gh models run`. Select your model from the listed models, then send your prompts.
- To ask a model a single question, run `gh models run MODEL-NAME "QUESTION"` in your terminal. For example, to ask the `gpt-4o` model why the sky is blue, you can run `gh models run gpt-4o "why is the sky blue?"`.
- To provide the output of a command as context when you call a model, you can join a separate command and the call to the model with the pipe character (`|`). For example, to summarize the README file in the current directory using the `gpt-4o` model, you can run `cat README.md | gh models run gpt-4o "summarize this text"`.
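The same piping pattern works with any command that writes to standard output. As an illustrative sketch (the prompt wording and the choice of `git log` flags are examples, not required syntax), you could turn recent commit messages into draft release notes:

```shell
# Illustrative example: pipe the last 20 commit subjects into a model
git log --oneline -20 | gh models run gpt-4o "draft release notes from these commits"
```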