diff --git a/assets/images/help/copilot/spark-fix-all-errors.png b/assets/images/help/copilot/spark-fix-all-errors.png new file mode 100644 index 0000000000..59a4c05d12 Binary files /dev/null and b/assets/images/help/copilot/spark-fix-all-errors.png differ diff --git a/assets/images/help/copilot/spark-github-user-visibility.png b/assets/images/help/copilot/spark-github-user-visibility.png new file mode 100644 index 0000000000..030d7778d5 Binary files /dev/null and b/assets/images/help/copilot/spark-github-user-visibility.png differ diff --git a/content/copilot/concepts/copilot-billing/about-billing-for-github-spark.md b/content/copilot/concepts/copilot-billing/about-billing-for-github-spark.md new file mode 100644 index 0000000000..be3d4dad3f --- /dev/null +++ b/content/copilot/concepts/copilot-billing/about-billing-for-github-spark.md @@ -0,0 +1,34 @@ +--- +title: About billing for GitHub Spark +intro: 'Learn how {% data variables.product.prodname_spark %} is billed for users.' +versions: + feature: spark +topics: + - Copilot +shortTitle: Billing for Spark +--- + +{% data reusables.copilot.spark-business-intro %} + +> [!NOTE] +> {% data reusables.spark.preview-note-spark %} + +## Billing for {% data variables.product.prodname_spark_short %} app creation + +Each prompt consumes 4 premium requests, which draw from your plan's premium request allowance. If you or an administrator has set a budget for premium requests over your plan's allowance, additional premium requests beyond your plan’s included amount are billed at {% data variables.copilot.additional_premium_requests %} per request, meaning that one prompt to {% data variables.product.prodname_spark_short %} would cost **$0.16**. See [AUTOTITLE](/copilot/concepts/copilot-billing/understanding-and-managing-requests-in-copilot). + +## Billing and limits for {% data variables.product.prodname_spark_short %} app deployment + +You can publish apps created with {% data variables.product.prodname_spark_short %} to a deployment environment. + +Deployed apps do not currently incur any charges. However, {% data variables.product.company_short %} currently **limits usage** of deployed sparks based on criteria including number of HTTP requests, data transfer, and storage. + +* Limits apply to the billable owner, meaning if you own 10 deployed sparks, all 10 will count towards the limits. +* When any limit is reached, the spark is unpublished for the rest of the billing period. + +In the future, a new billing system will allow sparks to continue being deployed once a limit is reached, with additional usage charged to the spark's billable owner. {% data variables.product.company_short %} will publish the limits once they are confirmed following a testing period. This article will be updated when more details are available. 
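+ +## Example: estimating prompt costs + +As an illustration of the app creation rates described above: a spark built with 25 prompts consumes 25 × 4 = 100 premium requests. Prompts covered by your plan's included allowance incur no extra charge, and any prompts beyond it are billed at $0.16 each, so 25 prompts entirely over your allowance would cost 25 × $0.16 = $4.00.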
+ +## Further reading + +* [AUTOTITLE](/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark) +* [AUTOTITLE](/copilot/tutorials/building-ai-app-prototypes) diff --git a/content/copilot/concepts/copilot-billing/index.md b/content/copilot/concepts/copilot-billing/index.md index 546a7f0090..aeb1cb5e72 100644 --- a/content/copilot/concepts/copilot-billing/index.md +++ b/content/copilot/concepts/copilot-billing/index.md @@ -12,6 +12,7 @@ children: - /about-billing-for-individual-copilot-plans - /about-billing-for-github-copilot-in-your-organization - /about-billing-for-github-copilot-in-your-enterprise + - /about-billing-for-github-spark redirect_from: - /managing-copilot/managing-copilot-as-an-individual-subscriber/billing-and-payments - /copilot/managing-copilot/understanding-and-managing-copilot-usage diff --git a/content/copilot/concepts/copilot-billing/understanding-and-managing-requests-in-copilot.md b/content/copilot/concepts/copilot-billing/understanding-and-managing-requests-in-copilot.md index 676355ed8e..d74b3abf09 100644 --- a/content/copilot/concepts/copilot-billing/understanding-and-managing-requests-in-copilot.md +++ b/content/copilot/concepts/copilot-billing/understanding-and-managing-requests-in-copilot.md @@ -39,6 +39,7 @@ The following {% data variables.product.prodname_copilot_short %} features can u | [{% data variables.product.prodname_copilot_short %} code review](/copilot/using-github-copilot/code-review/using-copilot-code-review) | When you assign {% data variables.product.prodname_copilot_short %} as a reviewer for a pull request, **one premium request** is used each time {% data variables.product.prodname_copilot_short %} posts comments to the pull request. | | [{% data variables.copilot.copilot_extensions_short %}](/copilot/building-copilot-extensions/about-building-copilot-extensions) | {% data variables.copilot.copilot_extensions_short %} uses **one premium request** per user prompt, multiplied by the model's rate. | | [{% data variables.copilot.copilot_spaces %}](/copilot/using-github-copilot/copilot-spaces/about-organizing-and-sharing-context-with-copilot-spaces) | {% data variables.copilot.copilot_spaces %} uses **one premium request** per user prompt, multiplied by the model's rate. | +| [{% data variables.product.prodname_spark_short %}](/copilot/tutorials/building-ai-app-prototypes) | Each prompt to {% data variables.product.prodname_spark_short %} uses a fixed rate of **four premium requests**. | ## How do request allowances work per plan? diff --git a/content/copilot/get-started/github-copilot-features.md b/content/copilot/get-started/github-copilot-features.md index 49df6f2778..ad2826380a 100644 --- a/content/copilot/get-started/github-copilot-features.md +++ b/content/copilot/get-started/github-copilot-features.md @@ -69,6 +69,10 @@ Organize and centralize relevant content—like code, docs, specs, and more—in Create and manage collections of documentation to use as context for chatting with {% data variables.product.prodname_copilot_short %}. When you ask a question in {% data variables.copilot.copilot_chat_dotcom_short %} or in {% data variables.product.prodname_vscode_shortname %}, you can specify a knowledge base as the context for your question. See [AUTOTITLE](/copilot/customizing-copilot/managing-copilot-knowledge-bases). 
+### {% data variables.product.prodname_spark %} ({% data variables.release-phases.public_preview %}) + +Build and deploy full-stack applications using natural-language prompts that seamlessly integrate with the {% data variables.product.github %} platform for advanced development. See [AUTOTITLE](/copilot/tutorials/building-ai-app-prototypes). + ## {% data variables.product.prodname_copilot %} features for administrators The following features are available to organization and enterprise owners with a {% data variables.copilot.copilot_business_short %} or {% data variables.copilot.copilot_enterprise_short %} plan. diff --git a/content/copilot/responsible-use-of-github-copilot-features/index.md b/content/copilot/responsible-use-of-github-copilot-features/index.md index 9bc7a09237..334e507692 100644 --- a/content/copilot/responsible-use-of-github-copilot-features/index.md +++ b/content/copilot/responsible-use-of-github-copilot-features/index.md @@ -18,4 +18,5 @@ children: - /responsible-use-of-github-copilot-text-completion - /responsible-use-of-github-copilot-code-review - /responsible-use-of-copilot-coding-agent-on-githubcom + - /responsible-use-of-github-spark --- diff --git a/content/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark.md b/content/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark.md new file mode 100644 index 0000000000..8a6648a913 --- /dev/null +++ b/content/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark.md @@ -0,0 +1,109 @@ +--- +title: Responsible use of GitHub Spark +shortTitle: Spark +intro: 'Learn how to use {% data variables.product.prodname_spark %} responsibly by understanding its purposes, capabilities, and limitations.' +versions: + feature: spark +topics: + - Copilot + - AI +type: rai +--- + +{% data reusables.rai.spark-preview-note %} + +## About {% data variables.product.prodname_spark %} + +{% data variables.product.prodname_spark_short %} is a {% data variables.product.prodname_copilot_short %}-powered platform for creating and sharing applications (“sparks”) that can be tailored to individual needs and accessed seamlessly across desktop and mobile devices \- without requiring users to write or deploy code. + +{% data variables.product.prodname_spark_short %} offers a natural language centric development environment for application creation and a fully managed runtime environment that scales with your sparks’ needs. {% data variables.product.prodname_spark_short %} eliminates the need to manually manage infrastructure or stitch together multiple tools, letting you focus on building. + +### Input processing + +> [!NOTE] {% data variables.product.prodname_spark_short %} currently leverages {% data variables.copilot.copilot_claude_sonnet_40 %}. This model is subject to change. + +Input prompts in {% data variables.product.prodname_spark_short %} are pre-processed by {% data variables.product.prodname_copilot_short %}, augmented with contextual information from your current {% data variables.product.prodname_spark_short %} inputs and sent to a large language model powered agent within your development environment. Included context includes information from your spark such as code from your current application, previous prompts supplied in the {% data variables.product.prodname_spark_short %} interface, and any error logs from your spark’s development environment. + +The system is only designed to generate code based on submitted prompts. 
It is not capable of conversational interactions. English is the preferred language for submitted prompts. + +### Language model analysis + +The prompt is then passed through a large language model, which is a neural network that has been trained on a large body of text data. The language model analyzes the input prompt to help the agent reason about the task and leverage the necessary tools. + +### Agent execution + +The agent, which runs in your development environment, accepts your prompt and the additional context, and decides how to update your spark to satisfy your request. The agent is able to operate your development environment by writing code, running commands, and reading execution outputs. All of the actions taken by the agent are aimed at producing functional, accurate code that fulfills your prompt. The only output from the agent is your application code. + +### {% data variables.product.prodname_spark_short %} frameworks + +The {% data variables.product.prodname_spark_short %} agent is trained to use frameworks and SDKs supplied by {% data variables.product.prodname_spark_short %} that ensure modern design and secure deployments seamlessly integrated into {% data variables.product.prodname_spark_short %}’s runtime component. The design framework is flexible and modular, enabling you to easily modify the theme to match your desired look and feel. {% data variables.product.prodname_spark_short %}’s runtime integration, accessible via the SDK, uses best practices for web deployments to ensure secure, scalable deployments. + +### Adding inference capabilities to your spark + +{% data variables.product.prodname_spark_short %}’s SDK natively integrates with {% data variables.product.prodname_github_models %}, allowing you to incorporate model inference into your spark. If {% data variables.product.prodname_spark_short %} determines that your application requires inference capabilities, it will add them using the {% data variables.product.prodname_spark_short %} SDK. + +{% data variables.product.prodname_spark_short %} gives you the tools to create, modify, and test the prompts that will be used with these inference capabilities. {% data variables.product.prodname_spark_short %} does not test the prompts that you create within your application, so you must ensure that your included capabilities act as intended. For more information on responsible use within {% data variables.product.prodname_github_models %}, see [AUTOTITLE](/github-models/responsible-use-of-github-models). + +### Data processing + +{% data variables.product.prodname_spark_short %} collects data to operate the service, including prompts, suggestions, and code snippets necessary to ensure continuity between sessions. {% data variables.product.prodname_spark_short %} also collects additional usage information, including usage patterns, submitted feedback, and performance telemetry. + +## Use cases for {% data variables.product.prodname_spark_short %} + +### Building and deploying full-stack web applications + +You can use {% data variables.product.prodname_spark_short %} to build full-stack web applications for you using natural language. {% data variables.product.prodname_spark_short %}’s integrated runtime environment allows you to deploy these applications to the public internet.
You can define permissions for these deployed applications based on {% data variables.product.github %} account visibility, allowing them to be visible to the general public, specific {% data variables.product.github %} members, members of your team or organization, or just you. Sparks can be anything, from board game score trackers to full software-as-a-service products. However, whatever you deploy remains subject to {% data variables.product.github %}’s [Terms](/free-pro-team@latest/site-policy/github-terms/github-terms-for-additional-products-and-features#github-copilot) for user-generated content. + +### Prototyping ideas + +{% data variables.product.prodname_spark_short %} helps developers, designers, product managers, and other builders rapidly prototype ideas without needing to build applications from scratch or construct complex mockups. These prototypes can be deployed for ease of sharing, or can remain unpublished as a way for builders to instantly see their vision. + +## Improving performance for {% data variables.product.prodname_spark_short %} + +{% data variables.product.prodname_spark_short %} can build a wide variety of applications, and iterate on them over time to increase complexity as new requirements surface. To enhance performance and address some limitations of {% data variables.product.prodname_spark_short %}, you can adopt several best practices. For more information about the limitations of {% data variables.product.prodname_spark_short %}, see [Limitations of {% data variables.product.prodname_spark_short %}](#limitations-of-github-spark). + +### Keep your prompts specific and on topic + +{% data variables.product.prodname_spark_short %} is intended to build and iterate on your spark. The more specific you can be about the intended behaviors and interactions, the better the output from {% data variables.product.prodname_spark_short %} will be. Incorporating relevant context such as specific scenarios, mockups, or specifications will help {% data variables.product.prodname_spark_short %} understand your intent, which will improve the output you receive. + +{% data variables.product.prodname_spark_short %} also incorporates context from previous prompts into each subsequent revision it generates. Submitting off-topic prompts may hinder performance on subsequent revisions, so try to keep your prompts as relevant as possible to the application you are building. + +### Use targeted edits appropriately + +Targeted edits in {% data variables.product.prodname_spark_short %} allow you to select specific elements within your application and refine their style, substance, or behavior. These targeted edits are an excellent way to constrain the edit surface area and express intent to {% data variables.product.prodname_spark_short %}. Using targeted edits when possible (rather than global prompts) will result in more accurate changes, as well as fewer side effects in your application as {% data variables.product.prodname_spark_short %} generates new revisions. + +### Verify {% data variables.product.prodname_spark_short %}’s output + +While {% data variables.product.prodname_spark_short %} is an extremely powerful tool, it may still make mistakes. These mistakes can range from misunderstandings of your goals to simple syntax errors within your generated spark.
You should always use {% data variables.product.prodname_spark_short %}’s provided application preview to verify that your spark behaves as intended in different scenarios. If you are comfortable with code, it is also a best practice to ensure the generated code is up to your code quality standards. + +## Limitations of GitHub Spark + +### Interpretation of user intent + +{% data variables.product.prodname_spark_short %} is not always correct in its interpretation of your intent. You should always use {% data variables.product.prodname_spark_short %}’s provided preview to confirm accurate behavior within your spark. + +### Limited scope + +{% data variables.product.prodname_spark_short %} is backed by {% data variables.product.prodname_copilot_short %}, and therefore has been trained on a large body of code and relevant applications. However, it may still struggle with complex or truly novel applications. {% data variables.product.prodname_spark_short %} will perform best on common or personal application scenarios (for example, productivity tools, learning aids, and life management utilities), and when the natural language instruction is provided in English. + +### Security limitations + +While {% data variables.product.prodname_spark_short %}’s runtime follows best practices for application deployment, it does generate code probabilistically, which can potentially introduce vulnerabilities, especially if those vulnerabilities are common in the training set of applications. You should be careful when building applications that manage personal or sensitive data, and always review and test the generated application thoroughly. + +### Legal and regulatory considerations + +Users need to evaluate potential legal and regulatory obligations when using any AI services and solutions, which may not be appropriate for use in every industry or scenario. Additionally, AI services or solutions are not designed for and may not be used in ways prohibited in applicable terms of service and relevant codes of conduct. + +### Offensive content + +{% data variables.product.prodname_spark_short %} has built-in protections against harmful, hateful, or offensive content. Report any examples of offensive content to copilot-safety@github.com, and include your spark’s URL so that we can identify the spark. + +You can report problematic or illegal content via Feedback, or you can report a spark as abuse or spam. See [AUTOTITLE](/communities/maintaining-your-safety-on-github/reporting-abuse-or-spam) and {% data variables.product.github %}'s [Content Removal Policies](/free-pro-team@latest/site-policy/content-removal-policies).
+ +## Further reading + +* [AUTOTITLE](/copilot/tutorials/building-your-first-app-in-minutes-with-github-spark) +* [AUTOTITLE](/copilot/tutorials/building-ai-app-prototypes) +* [AUTOTITLE](/copilot/concepts/copilot-billing/about-billing-for-github-spark) +* [AUTOTITLE](/github-models/responsible-use-of-github-models) +* [AUTOTITLE](/free-pro-team@latest/site-policy/github-terms/github-pre-release-license-terms) diff --git a/content/copilot/tutorials/building-ai-app-prototypes.md b/content/copilot/tutorials/building-ai-app-prototypes.md new file mode 100644 index 0000000000..e2ae6d8d4b --- /dev/null +++ b/content/copilot/tutorials/building-ai-app-prototypes.md @@ -0,0 +1,183 @@ +--- +title: Building and deploying AI-powered apps with GitHub Spark +shortTitle: Build intelligent apps with Spark +allowTitleToDifferFromFilename: true +intro: 'Learn how to build and deploy an intelligent web app with natural language using {% data variables.product.prodname_spark %}.' +versions: + feature: spark +product: 'Anyone with a {% data variables.copilot.copilot_pro_plus_short %} license can use {% data variables.product.prodname_spark_short %}.' +topics: + - Copilot +--- + +> [!NOTE] +> * {% data reusables.spark.preview-note-spark %} +> * The {% data variables.product.prodname_copilot %} setting that blocks suggestions matching public code may not work as intended when using {% data variables.product.prodname_spark_short %}. See [AUTOTITLE](/copilot/how-tos/manage-your-account/managing-copilot-policies-as-an-individual-subscriber#enabling-or-disabling-suggestions-matching-public-code). + +## Introduction + +With {% data variables.product.prodname_spark %}, you can describe what you want in natural language and get a full-stack web app with data storage, AI features, and {% data variables.product.github %} authentication built in. You can iterate using prompts, visual tools, or code, and then deploy with a click to a fully managed runtime. + +{% data variables.product.prodname_spark_short %} is seamlessly integrated with {% data variables.product.github %}, so you can develop your spark via a synced {% data variables.product.github %} codespace with {% data variables.product.prodname_copilot_short %} for advanced editing. You can also create a repository for team collaboration, and leverage {% data variables.product.github %}'s ecosystem of tools and integrations. + +This tutorial will guide you through building and deploying an app with {% data variables.product.prodname_spark_short %} and exploring its features. + +### Prerequisites + +* A {% data variables.product.github %} account with {% data variables.copilot.copilot_pro_plus_short %}. + +## Step 1: Create your web app + +For this tutorial, we'll create a simple marketing tool app, where: +* The user enters a description of a product they want to market. +* The app generates marketing copy and recommends a visual strategy and target audience. + +1. Navigate to https://github.com/spark. +1. In the input field, enter a description of your app. For example: + + ```text copy + Build an app called "AI-Powered Marketing Assistant." + + The app should allow users to input a brief description of a product or service. When the user submits their brief, send this information to a generative AI model with a prompt that asks the AI to return the following: + - Persuasive and engaging marketing copy for the product or service. + - A visual strategy for how to present the product/service (e.g., suggested imagery, colors, design motifs, or mood).
- A recommendation for the ideal target audience. + The app should display these three elements clearly and in an organized manner. The app should look modern, fresh and engaging. + ``` + + > [!TIP] + > * Be specific, and provide as many details as possible for the best results. You can use [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) to refine or suggest improvements to your initial prompt. + > * Alternatively, drop a markdown document into the input field to provide {% data variables.product.prodname_spark_short %} with more context on what you're hoping to build. + +1. Optionally, upload an image to provide {% data variables.product.prodname_spark_short %} with a visual reference for your app. Mocks, sketches, or screenshots all work to give {% data variables.product.prodname_spark_short %} an idea of what you want to build. +1. Click **{% octicon "paper-airplane" aria-label="Submit prompt" %}** to build your app. + + > [!NOTE] + > {% data variables.product.prodname_spark_short %} will always generate a TypeScript and React app. + +## Step 2: Refine and expand your app + +Once {% data variables.product.prodname_spark_short %} is done generating your app, you can test it out in the live preview window. From here, you can iterate on and expand your app using natural language, visual editing controls, or code. + +1. To make changes to your app using **natural language**, under the "Iterate" tab in the left sidebar, enter your instructions in the main input field, then submit. +1. Optionally, click one of the "Suggestions" directly above the input field in the "Iterate" tab to develop your app. +1. {% data variables.product.prodname_spark_short %} automatically alerts you to detected errors. To fix the errors, click **Fix All** above the input field in the "Iterate" tab. +1. Optionally, click **{% octicon "code" aria-hidden="true" aria-label="code" %} Code** to view and edit the underlying code. The code editing panel has {% data variables.product.prodname_copilot_short %} code completion built in. +1. To make targeted changes to a specific element of your app, click the **target icon** in the top right corner, then hover over and select an element in the live preview pane. + +## Step 3: Customize the styling of your app + +Next, let's change the styling of your app using {% data variables.product.prodname_spark_short %}'s built-in tools. Alternatively, you can edit the code directly. + +1. Change your app's overall appearance: + * Click the **Theme** tab to adjust typography, colors, border radius, spacing, and other visual elements. + * Choose from pre-generated themes to easily update the overall style of your app. +1. To target visual edits at a specific component, click the **target icon**, then select an element of the app in the preview pane. Styling controls related to that specific element will show up in the left sidebar. +1. Optionally, edit styles in code: + * Click **{% octicon "code" aria-label="Code" %}** to open the code editor. + * Modify CSS, Tailwind CSS, or custom variables for fine-grained control (for example, padding, spacing, fonts, and colors). + + > [!TIP] + > You can import custom fonts (like Google Fonts) or add advanced styles directly in the {% data variables.product.prodname_spark_short %} code editor. + > Ask [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) for step-by-step guidance if you're not familiar with styling syntax. + +1. Click the **Assets** tab to upload assets you want to surface in your app.
* Add images, logos, videos, documents, or other assets to personalize your app. + * Once uploaded, instruct {% data variables.product.prodname_spark_short %} on how you'd like to incorporate those assets into your app in the "Iterate" tab. + +## Step 4: Store and manage data + +If {% data variables.product.prodname_spark_short %} detects the need to store data in your app, it will automatically set up data storage for you using a key-value store. + +> [!NOTE] +> If you deploy your spark and make it visible to other users, the data in your app is **shared across all users** that can access your app. Make sure no sensitive data is included in your spark prior to updating visibility settings. + +For our marketing app, let's add data storage so that users can save their favorite pieces of marketing copy and easily access them again later: + +1. Use the following instruction in the "Iterate" tab to guide {% data variables.product.prodname_spark_short %}: + + ```text copy + Add a "Favorites" page where users can save and view their favorite marketing copy results. + ``` + +1. Interact with the app once it's done generating to test saving and retrieving favorites. +1. Check the "Data" tab to view and edit the stored values. +1. If you explicitly **don't** want {% data variables.product.prodname_spark_short %} to save data, ask {% data variables.product.prodname_spark_short %} to "store data locally" or "don't persist data". + +## Step 5: Refine AI capabilities + +Next, let's iterate on the AI capabilities included in our app, which are powered by {% data variables.product.prodname_github_models %}. + +{% data variables.product.prodname_spark_short %} automatically detects when AI is needed for features in your app. It will auto-generate the prompts for each AI feature, integrate with the best-fit models, and manage API integration and LLM inference on your behalf. + +1. Click the **Prompts** tab. +1. Review the prompts {% data variables.product.prodname_spark_short %} generated to power each of the AI features used in your app. + * In the case of our marketing app, there are three separate prompts {% data variables.product.prodname_spark_short %} has generated for us (marketing copy generation, visual strategy recommendation, and target audience recommendation). +1. Click on each prompt to view and edit it without needing to go into the code. Make adjustments to better fit your use case. +1. Test the app to see updated results. + +## Step 6: Edit and debug with code and {% data variables.product.prodname_copilot_short %} + +You can view or edit your app’s code directly in {% data variables.product.prodname_spark_short %} or via a synced {% data variables.product.github %} codespace. + +> [!NOTE] +> * {% data variables.product.prodname_spark_short %} uses an opinionated stack (**React**, **TypeScript**) for reliability. +> * For best results, you should **work within {% data variables.product.prodname_spark_short %}'s SDK** and core framework. +> * You can **add external libraries**, but compatibility isn’t guaranteed — you should test thoroughly. +> * Directly editing the React code **lets you add model context**, as long as you follow valid syntax and {% data variables.product.prodname_spark_short %}'s framework. + +1. To edit code in {% data variables.product.prodname_spark_short %}: + * Click **{% octicon "code" aria-label="Code" %} Code**. + * Navigate the file tree and make any edits, with access to {% data variables.product.prodname_copilot_short %} code completions in the editor. Changes are reflected instantly in the live preview window.
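+
+   For example, if you added the "Favorites" feature from Step 4, the generated code might include a small React component along the lines of the sketch below. This is a simplified, hypothetical sketch: the `useKV` hook and its import path are assumptions about the {% data variables.product.prodname_spark_short %} SDK rather than a documented API, so treat the code in your own spark as the source of truth.
+
+   ```tsx
+   // Hypothetical sketch of a Spark-generated component. The import path and the
+   // useKV hook are assumptions, not a documented API.
+   import { useKV } from "@github/spark/hooks";
+
+   type Favorite = {
+     id: string;
+     copyText: string;
+   };
+
+   export function FavoritesList() {
+     // Assumed to behave like useState, but persisted in the spark's built-in
+     // key-value store so saved favorites survive page reloads.
+     const [favorites, setFavorites] = useKV<Favorite[]>("favorites", []);
+
+     const removeFavorite = (id: string) => {
+       setFavorites((current) => current.filter((favorite) => favorite.id !== id));
+     };
+
+     return (
+       <ul>
+         {favorites.map((favorite) => (
+           <li key={favorite.id}>
+             {favorite.copyText}
+             <button onClick={() => removeFavorite(favorite.id)}>Remove</button>
+           </li>
+         ))}
+       </ul>
+     );
+   }
+   ```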
+1. To make more advanced edits: + * In the top right corner, click **{% octicon "kebab-horizontal" aria-label="More actions" %}**, then click **{% octicon "codespaces" aria-label="Open codespace" %} Open codespace** (a full-featured cloud IDE) to launch a codespace in a new browser tab. + * Once inside the codespace, click **{% octicon "copilot" aria-hidden="true" aria-label="copilot" %}** to open {% data variables.product.prodname_copilot_short %} and make more advanced changes. + * In the prompt box, select **Agent** mode to enable {% data variables.product.prodname_copilot_short %} to autonomously build, review, and troubleshoot your code. + * Select **Edit** mode for {% data variables.product.prodname_copilot_short %} to review your app's code and suggest improvements and fixes. + * Choose **Ask** mode for {% data variables.product.prodname_copilot_short %} to explain and help you understand the code or any errors you see in {% data variables.product.prodname_spark_short %}. + * Changes you make in the codespace are automatically synced to {% data variables.product.prodname_spark_short %}. + +## Step 7: Deploy and share your app + +{% data variables.product.prodname_spark_short %} comes with a fully integrated runtime environment that allows you to deploy your app in one click. + +> [!NOTE] +> If you make your spark accessible to all {% data variables.product.github %} users, all users will be able to access and edit the data stored in your spark. Make sure to delete any private or sensitive data from your app prior to making it visible to other users. + +1. In the top right corner, click **Publish**. +1. By default, your spark will be private and only accessible to you. Under "Visibility", choose whether you want your spark to remain private or make it available to all {% data variables.product.github %} users. + + ![Screenshot of the {% data variables.product.prodname_spark %} publication menu. The "All {% data variables.product.github %} users" visibility option is outlined in orange.](/assets/images/help/copilot/spark-github-user-visibility.png) + +1. Click **Visit site** to be taken to your live, deployed app. Copy your site's URL to share with others. + > [!NOTE] + > When you publish your app, {% data variables.product.prodname_spark_short %} automatically includes cloud-based storage and LLM inference for your application to use as part of the integrated runtime. + > + > The URL for your spark is generated based on the name of your spark. You can edit the name of your app, and {% data variables.product.prodname_spark_short %} will automatically redirect old URLs to your latest URL. + +## Step 8: Invite collaborators with a repository + +Now that you have a functional, deployed app, you can continue to build and collaborate on your app in the same way you would with any other {% data variables.product.github %} project, by creating and linking a {% data variables.product.github %} repository to your spark. + +1. In the top right corner, click **{% octicon "kebab-horizontal" aria-label="More actions" %}**, then click **{% octicon "repo-push" aria-hidden="true" aria-label="repo-push" %} Create repository**. +1. In the dialog box that opens, click **Create**. + +A new, private repository is created under your personal account on {% data variables.product.github %}, with the name of the repository based on the name of your spark.
+ +Any changes made to your spark prior to repository creation will be added to your repository so you have a full record of all changes and commits made to your spark since its creation. + +There's a two-way sync between your spark and the repository, so changes made in either {% data variables.product.prodname_spark_short %} or the main branch of your repository are automatically reflected in both places. + +You can also create issues in your repository and assign them to {% data variables.copilot.copilot_coding_agent %} so it can draft pull requests for fixes and improvements. + +## Next steps + +Explore more ideas you can build with {% data variables.product.prodname_spark_short %}: +* **Prototype new ideas quickly**: if you have a specific idea for a feature or app, upload a mockup, sketch, screenshot, or even paste a markdown documentation into {% data variables.product.prodname_spark_short %} and ask {% data variables.product.prodname_spark_short %} to build out your idea. +* **Build internal tools for yourself and your team**: If you have a common workflow or process that currently sits in a document or spreadsheet, explain your workflow or process to {% data variables.product.prodname_spark_short %} and {% data variables.product.prodname_spark_short %} can turn it into an interactive web app. + +## Further reading + +* [AUTOTITLE](/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark) +* [AUTOTITLE](/copilot/concepts/copilot-billing/about-billing-for-github-spark) +* [AUTOTITLE](/free-pro-team@latest/site-policy/github-terms/github-pre-release-license-terms) diff --git a/content/copilot/tutorials/building-your-first-app-in-minutes-with-github-spark.md b/content/copilot/tutorials/building-your-first-app-in-minutes-with-github-spark.md new file mode 100644 index 0000000000..53c01ff225 --- /dev/null +++ b/content/copilot/tutorials/building-your-first-app-in-minutes-with-github-spark.md @@ -0,0 +1,79 @@ +--- +title: Building your first app in minutes with GitHub Spark +shortTitle: Easy apps with Spark +intro: "Learn how to use {% data variables.product.prodname_spark %} to quickly create and deploy an app without writing any code." +allowTitleToDifferFromFilename: true +versions: + feature: spark +product: 'Anyone with a {% data variables.copilot.copilot_pro_plus_short %} license can use {% data variables.product.prodname_spark_short %}.' +--- + +Have you ever had a great idea for an app, but you didn't have the tools to build it? With the help of AI, you can now bring your app ideas to life in minutes using only natural language. In this article, we'll use {% data variables.product.prodname_spark %} to build, improve, and share a word search app without writing a single line of code ourselves. + +> [!NOTE] +> {% data reusables.spark.preview-note-spark %} + +## Creating a prototype of your app + +Let's start by generating an initial, basic version of our app that we can build on later. + +1. Navigate to https://github.com/spark. +1. Send the following prompt to generate the first iteration of your app: + + ```text copy + Please create a word search game. The game should take in a set of words from the user, then create a word search puzzle containing those words, as well as a word bank listing the words. Words in the puzzle can be horizontal, vertical, diagonal, forwards, and backwards, and are "found" when the user clicks and drags their mouse across them. Once all words are found, give the user the option to create a new puzzle. + ``` + +1. 
Watch as {% data variables.product.prodname_spark_short %} builds your app in real time! You'll know the app is done generating when the preview appears. +1. To test your app, create and solve a puzzle using the preview. + +## Improving your app + +Just like that, we have a working app! However, it still needs some tweaks. Let's give {% data variables.product.prodname_spark_short %} some additional prompts to polish our project. + +1. At the left side of the page, in the **Iterate** tab, send the following prompt: + + ```text copy + Please add a leaderboard and a timer to the game. The timer should start when the user generates a new puzzle, then stop when all words are found. The user should then be able to enter their name, and their name, time, and the number of words in their puzzle should be displayed on the leaderboard. The leaderboard should be sortable in ascending and descending order by each of the three categories. + ``` + +1. Once the app is updated, create and solve another puzzle to see the new features in action. +1. Get creative and make your own improvements to the app! If you're feeling stuck, pick one of the suggestions {% data variables.product.prodname_spark_short %} provides above the prompt text box. You can also make changes using the visual editing controls in the "Theme", "Data", and "Prompts" tabs, without ever having to touch code. + +## Debugging your app + +While you're building your app, you may encounter some errors. Often, {% data variables.product.prodname_spark_short %} will identify these issues and list them in an "Errors" pop up above the prompt text box. To fix the errors, click **Fix all**. + +![Screenshot of errors identified by {% data variables.product.prodname_spark %}. The "Fix all" button is outlined in orange.](/assets/images/help/copilot/spark-fix-all-errors.png) + +If you find an error that {% data variables.product.prodname_spark_short %} itself didn't flag, write a prompt to fix it. For best results, provide a detailed description of the error, as well as the ideal fixed state. For example, if you notice that adding words over a certain number of characters causes the puzzle to render incorrectly, send the following prompt: + +```text copy +Please prevent users from entering words longer than the number of rows or columns in the puzzle. Additionally, add an option to change the size of a puzzle. If the user tries to enter a word that's longer than the current size of the puzzle, display an error message telling them that provided words must be less than or equal to the size of the puzzle. +``` + +## Sharing your app + +Now that you're happy with your app, let's deploy it so you can share it with others. + +> [!NOTE] +> If you make your spark accessible to all {% data variables.product.github %} users, all users will be able to access and edit the data stored in your spark. Make sure to delete any private or sensitive data from your app prior to making it visible to other users. + +1. In the top-right corner of the page, click **Publish**. +By default, your spark is deployed as private and only accessible to you. To let other {% data variables.product.github %} users access your app, in the **Visibility** section of the publication dropdown, choose {% octicon "id-badge" aria-hidden="true" aria-label="id-badge" %} **All {% data variables.product.github %} users**. This allows anyone with a {% data variables.product.github %} account to access your spark. + + ![Screenshot of the {% data variables.product.prodname_spark %} publication menu. 
The "All {% data variables.product.github %} users" visibility option is outlined in orange.](/assets/images/help/copilot/spark-github-user-visibility.png) + +1. Click **View site** {% octicon "link-external" aria-hidden="true" aria-label="link-external" %} to see your deployed app, then copy and share your app's URL. + +## Next steps + +We just created a word search app, but {% data variables.product.prodname_spark_short %} can make all kinds of web apps! Try [creating a new app](https://github.com/spark) on your own. If you need some inspiration, here are a few ideas to get you started: + +* Try building a **news aggregator app** or an **intelligent recipe generator**. +* Build a **budget tracker** that lets you set a budget, takes in a list of expenses, and displays your total remaining budget. You can give each expense a category and date, then sort the expenses by the many different categories. + +## Further reading + +* [AUTOTITLE](/copilot/responsible-use-of-github-copilot-features/responsible-use-of-github-spark) +* [AUTOTITLE](/copilot/concepts/copilot-billing/about-billing-for-github-spark) diff --git a/content/copilot/tutorials/index.md b/content/copilot/tutorials/index.md index 28043d1aa2..d502aeadf8 100644 --- a/content/copilot/tutorials/index.md +++ b/content/copilot/tutorials/index.md @@ -18,6 +18,8 @@ children: - /writing-tests-with-github-copilot - /refactoring-code-with-github-copilot - /learning-a-new-programming-language-with-github-copilot + - /building-your-first-app-in-minutes-with-github-spark + - /building-ai-app-prototypes - /modernizing-legacy-code-with-github-copilot - /using-copilot-to-migrate-a-project - /upgrading-projects-with-github-copilot diff --git a/data/features/spark.yml b/data/features/spark.yml new file mode 100644 index 0000000000..f6715c0d95 --- /dev/null +++ b/data/features/spark.yml @@ -0,0 +1,5 @@ +# Reference: #17636 +# Spark (No-code app builder, public preview) +versions: + fpt: '*' + ghec: '*' diff --git a/data/reusables/copilot/differences-cfi-cfb-table.md b/data/reusables/copilot/differences-cfi-cfb-table.md index 169ce2a53b..99d7628ccd 100644 --- a/data/reusables/copilot/differences-cfi-cfb-table.md +++ b/data/reusables/copilot/differences-cfi-cfb-table.md @@ -81,6 +81,7 @@ | Content exclusion | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "check" aria-label="Included" %} | {% octicon "check" aria-label="Included" %} | | {% data variables.product.prodname_copilot_short %} knowledge bases |{% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "check" aria-label="Included" %} | | {% data variables.copilot.copilot_cli_short %} | {% octicon "check" aria-label="Included" %} | {% octicon "check" aria-label="Included" %} | {% octicon "check" aria-label="Included" %} | {% octicon "check" aria-label="Included" %} | {% octicon "check" aria-label="Included" %} | +| {% data variables.product.prodname_spark %} ({% data variables.release-phases.public_preview %}) | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "check" aria-label="Included" %} | {% octicon "x" aria-label="Not included" %} | {% octicon "x" aria-label="Not included" %} | {% endrowheaders %} diff --git a/data/reusables/copilot/spark-business-intro.md 
b/data/reusables/copilot/spark-business-intro.md new file mode 100644 index 0000000000..ddfd5aec33 --- /dev/null +++ b/data/reusables/copilot/spark-business-intro.md @@ -0,0 +1 @@ +{% data variables.product.prodname_spark %} allows users to build applications using natural-language prompts, then share the apps with teammates or deploy them to production. diff --git a/data/reusables/rai/spark-preview-note.md b/data/reusables/rai/spark-preview-note.md new file mode 100644 index 0000000000..fb12c55f97 --- /dev/null +++ b/data/reusables/rai/spark-preview-note.md @@ -0,0 +1,2 @@ +> [!NOTE] +> {% data variables.product.prodname_spark %} is in public preview and subject to change. diff --git a/data/reusables/spark/preview-note-spark.md b/data/reusables/spark/preview-note-spark.md new file mode 100644 index 0000000000..56f703614d --- /dev/null +++ b/data/reusables/spark/preview-note-spark.md @@ -0,0 +1 @@ +{% data variables.product.prodname_spark %} is in public preview and subject to change. diff --git a/data/variables/product.yml b/data/variables/product.yml index 495c2cab8b..dcde2a2ab0 100644 --- a/data/variables/product.yml +++ b/data/variables/product.yml @@ -301,6 +301,10 @@ prodname_arctic_vault: 'Arctic Code Vault' prodname_copilot: 'GitHub Copilot' prodname_copilot_short: 'Copilot' +# GitHub Spark +prodname_spark: 'GitHub Spark' +prodname_spark_short: 'Spark' + # Windows prodname_windows_terminal: 'Windows Terminal'