Merge branch 'main' into main
@@ -13,6 +13,7 @@ module.exports = {
|
||||
babelOptions: { configFile: './.babelrc' },
|
||||
sourceType: 'module',
|
||||
},
|
||||
ignorePatterns: ['tmp/*'],
|
||||
rules: {
|
||||
'import/no-extraneous-dependencies': ['error', { packageDir: '.' }],
|
||||
},
.github/PULL_REQUEST_TEMPLATE.md (vendored, 2 changes)
@@ -25,7 +25,7 @@ Closes [issue link]
|
||||
### Check off the following:
|
||||
|
||||
- [ ] I have reviewed my changes in staging (look for "Automatically generated comment" and click **Modified** to view your latest changes).
|
||||
- [ ] For content changes, I have completed the [self-review checklist](https://github.com/github/docs/blob/main/CONTRIBUTING.md#self-review).
|
||||
- [ ] For content changes, I have completed the [self-review checklist](https://github.com/github/docs/blob/main/contributing/self-review.md#self-review).
|
||||
|
||||
### Writer impact (This section is for GitHub staff members only):
.github/workflows/test-windows.yml (vendored, 12 changes)
@@ -23,7 +23,17 @@ jobs:
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
test-group: [content, graphql, meta, rendering, routing, unit, linting]
|
||||
test-group:
|
||||
[
|
||||
content,
|
||||
graphql,
|
||||
meta,
|
||||
rendering,
|
||||
routing,
|
||||
unit,
|
||||
linting,
|
||||
translations,
|
||||
]
|
||||
steps:
|
||||
- name: Check out repo
|
||||
uses: actions/checkout@1e204e9a9253d643386038d443f96446fa156a97
.github/workflows/test.yml (vendored, 12 changes)
@@ -26,7 +26,17 @@ jobs:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
# The same array lives in test-windows.yml, so make any updates there too.
|
||||
test-group: [content, graphql, meta, rendering, routing, unit, linting]
|
||||
test-group:
|
||||
[
|
||||
content,
|
||||
graphql,
|
||||
meta,
|
||||
rendering,
|
||||
routing,
|
||||
unit,
|
||||
linting,
|
||||
translations,
|
||||
]
|
||||
steps:
|
||||
# Each of these ifs needs to be repeated at each step to make sure the required check still runs
|
||||
# Even if it doesn't do anything
README.md (15 changes)
@@ -8,21 +8,21 @@ Use the table of contents icon <img src="./assets/images/table-of-contents.png"
|
||||
|
||||
## Contributing
|
||||
|
||||
See [the contributing guide](CONTRIBUTING.md) for detailed instructions of how to get started with our project.
|
||||
See [the contributing guide](CONTRIBUTING.md) for detailed instructions on how to get started with our project.
|
||||
|
||||
We accept different [types of contributions](CONTRIBUTING.md/#types-of-contributions-memo), including some that don't require you to write a single line of code.
|
||||
We accept different [types of contributions](https://github.com/github/docs/blob/main/contributing/types-of-contributions.md), including some that don't require you to write a single line of code.
|
||||
|
||||
On the GitHub Docs site, you can click the make a contribution button to open a PR for quick fixes like typos, updates, or link fixes.
|
||||
On the GitHub Docs site, you can click the make a contribution button to open a PR (Pull Request) for quick fixes like typos, updates, or link fixes.
|
||||
|
||||
<img src="./assets/images/contribution_cta.png" width="400">
|
||||
|
||||
For more complex contributions, you can open an issue using the most appropriate [issue template](https://github.com/github/docs/issues/new/choose) to describe the changes you'd like to see.
|
||||
For more complex contributions, you can open an issue using the most appropriate [issue template](https://github.com/github/docs/issues/new/choose) to describe the changes you'd like to see. This way, you can also become part of the open source contributor community without writing a single line of code.
|
||||
|
||||
If you're looking for a way to contribute, you can scan through our [existing issues](https://github.com/github/docs/issues) for something to work on. When ready, check out [Getting Started with Contributing](/CONTRIBUTING.md) for detailed instructions.
|
||||
|
||||
### Join us in discussions
|
||||
|
||||
We use GitHub Discussions to talk about all sorts of topics related to documentation and this site. For example: if you'd like help troubleshooting a PR, have a great new idea, or want to share something amazing you've learned in our docs, join us in [discussions](https://github.com/github/docs/discussions).
|
||||
We use GitHub Discussions to talk about all sorts of topics related to documentation and this site. For example: if you'd like help troubleshooting a PR, have a great new idea, or want to share something amazing you've learned in our docs, join us in the [discussions](https://github.com/github/docs/discussions).
|
||||
|
||||
### And that's it!
|
||||
|
||||
@@ -33,6 +33,7 @@ That's how you can easily become a member of the GitHub Documentation community.
|
||||
## READMEs
|
||||
|
||||
In addition to the README you're reading right now, this repo includes other READMEs that describe the purpose of each subdirectory in more detail:
|
||||
You can go through them for specific details on the topics listed below.
|
||||
|
||||
- [content/README.md](content/README.md)
|
||||
- [content/graphql/README.md](content/graphql/README.md)
|
||||
@@ -54,9 +55,9 @@ In addition to the README you're reading right now, this repo includes other REA
|
||||
|
||||
The GitHub product documentation in the assets, content, and data folders is licensed under a [CC-BY license](LICENSE).
|
||||
|
||||
All other code in this repository is licensed under a [MIT license](LICENSE-CODE).
|
||||
All other code in this repository is licensed under the [MIT license](LICENSE-CODE).
|
||||
|
||||
When using the GitHub logos, be sure to follow the [GitHub logo guidelines](https://github.com/logos).
|
||||
When you are using the GitHub logos, be sure to follow the [GitHub logo guidelines](https://github.com/logos).
|
||||
|
||||
## Thanks :purple_heart:
Binary image file not shown. Before size: 7.2 KiB. After size: 11 KiB.
@@ -16,6 +16,26 @@ const SectionToLabelMap: Record<string, string> = {
|
||||
backups: 'Backups',
|
||||
}
|
||||
|
||||
const LabelColorMap = {
|
||||
features: 'color-bg-success-emphasis',
|
||||
bugs: 'color-bg-attention-emphasis',
|
||||
known_issues: 'color-bg-accent-emphasis',
|
||||
security_fixes: 'color-bg-sponsors-emphasis',
|
||||
changes: 'color-bg-success-emphasis',
|
||||
deprecations: 'color-bg-done-emphasis',
|
||||
backups: 'color-bg-severe-emphasis',
|
||||
}
|
||||
|
||||
const HeadingColorMap = {
|
||||
features: 'color-fg-success',
|
||||
bugs: 'color-fg-attention',
|
||||
known_issues: 'color-fg-accent',
|
||||
security_fixes: 'color-fg-sponsors',
|
||||
changes: 'color-fg-success',
|
||||
deprecations: 'color-fg-done',
|
||||
backups: 'color-fg-severe',
|
||||
}
|
||||
|
||||
type Props = {
|
||||
patch: ReleaseNotePatch
|
||||
withReleaseNoteLabel?: boolean
|
||||
@@ -25,6 +45,10 @@ export function PatchNotes({ patch, withReleaseNoteLabel }: Props) {
|
||||
<>
|
||||
{Object.entries(patch.sections).map(([key, sectionItems], i, arr) => {
|
||||
const isLast = i === arr.length - 1
|
||||
const primaryLabelColor =
|
||||
LabelColorMap[key as keyof typeof LabelColorMap] || LabelColorMap.features
|
||||
const primaryHeadingColor =
|
||||
HeadingColorMap[key as keyof typeof HeadingColorMap] || HeadingColorMap.features
|
||||
return (
|
||||
<div
|
||||
key={key}
|
||||
@@ -36,7 +60,12 @@ export function PatchNotes({ patch, withReleaseNoteLabel }: Props) {
|
||||
>
|
||||
{withReleaseNoteLabel && (
|
||||
<div className="col-12 col-xl-3 mb-5">
|
||||
<span className="px-3 py-2 text-small text-bold text-uppercase color-bg-emphasis color-fg-on-emphasis">
|
||||
<span
|
||||
className={cx(
|
||||
'px-3 py-2 color-fg-on-emphasis text-small text-bold text-uppercase',
|
||||
primaryLabelColor
|
||||
)}
|
||||
>
|
||||
{SectionToLabelMap[key] || 'INVALID SECTION'}
|
||||
</span>
|
||||
</div>
|
||||
@@ -52,7 +81,11 @@ export function PatchNotes({ patch, withReleaseNoteLabel }: Props) {
|
||||
<Fragment key={slug}>
|
||||
<h4
|
||||
id={slug}
|
||||
className={cx(styles.sectionHeading, 'text-uppercase text-bold f4')}
|
||||
className={cx(
|
||||
styles.sectionHeading,
|
||||
primaryHeadingColor,
|
||||
'text-uppercase text-bold f4'
|
||||
)}
|
||||
>
|
||||
<Link href={`#${slug}`} className="color-fg-inherit">
|
||||
{item.heading}
|
||||
|
||||
@@ -23,6 +23,7 @@ shortTitle: Merge multiple user accounts
|
||||
1. [Transfer any repositories](/articles/how-to-transfer-a-repository) from the account you want to delete to the account you want to keep. Issues, pull requests, and wikis are transferred as well. Verify the repositories exist on the account you want to keep.
|
||||
2. [Update the remote URLs](/github/getting-started-with-github/managing-remote-repositories) in any local clones of the repositories that were moved.
|
||||
3. [Delete the account](/articles/deleting-your-user-account) you no longer want to use.
|
||||
4. To attribute past commits to the new account, add the email address you used to author the commits to the account you're keeping. For more information, see "[Why are my contributions not showing up on my profile?](/account-and-profile/setting-up-and-managing-your-github-profile/managing-contribution-graphs-on-your-profile/why-are-my-contributions-not-showing-up-on-my-profile#your-local-git-commit-email-isnt-connected-to-your-account)"
|
||||
|
||||
## Further reading
|
||||
|
||||
|
||||
@@ -57,7 +57,7 @@ Before you begin, you'll need to create a {% data variables.product.prodname_dot
|
||||
|
||||
## Creating a Dockerfile
|
||||
|
||||
In your new `hello-world-docker-action` directory, create a new `Dockerfile` file. For more information, see "[Dockerfile support for {% data variables.product.prodname_actions %}](/actions/creating-actions/dockerfile-support-for-github-actions)."
|
||||
In your new `hello-world-docker-action` directory, create a new `Dockerfile` file. Make sure that your filename is capitalized correctly (use a capital `D` but not a capital `f`) if you're having issues. For more information, see "[Dockerfile support for {% data variables.product.prodname_actions %}](/actions/creating-actions/dockerfile-support-for-github-actions)."
|
||||
|
||||
**Dockerfile**
|
||||
```Dockerfile{:copy}
|
||||
|
||||
@@ -44,7 +44,7 @@ By default, the validation only includes the audience (`aud`) condition, so you
|
||||
"Condition": {
|
||||
"StringEquals": {
|
||||
"token.actions.githubusercontent.com:aud": "https://github.com/octo-org",
|
||||
"token.actions.githubusercontent.com:sub": "token.actions.githubusercontent.com:sub": "repo:octo-org/octo-repo:ref:refs/heads/octo-branch"
|
||||
"token.actions.githubusercontent.com:sub": "repo:octo-org/octo-repo:ref:refs/heads/octo-branch"
|
||||
```
|
||||
|
||||
## Updating your {% data variables.product.prodname_actions %} workflow
|
||||
|
||||
@@ -33,8 +33,8 @@ This guide gives an overview of how to configure Azure to trust {% data variable
|
||||
|
||||
To configure the OIDC identity provider in Azure, you will need to perform the following configuration. For instructions on making these changes, refer to [the Azure documentation](https://docs.microsoft.com/en-us/azure/developer/github/connect-from-azure).
|
||||
|
||||
1. Create an Active Directory application and a service principal.
|
||||
2. Add federated credentials for the Active Directory application.
|
||||
1. Create an Azure Active Directory application and a service principal.
|
||||
2. Add federated credentials for the Azure Active Directory application.
|
||||
3. Create {% data variables.product.prodname_dotcom %} secrets for storing Azure configuration.
|
||||
|
||||
Additional guidance for configuring the identity provider:
|
||||
@@ -97,4 +97,4 @@ jobs:
|
||||
tenant-id: {% raw %}${{ secrets.AZURE_TENANTID }}{% endraw %}
|
||||
subscription-id: {% raw %}${{ secrets.AZURE_SUBSCRIPTIONID }}{% endraw %}
|
||||
```
|
||||
|
||||
|
||||
|
||||
@@ -23,7 +23,7 @@ You can automatically increase or decrease the number of self-hosted runners in
|
||||
|
||||
The following repositories have detailed instructions for setting up these autoscalers:
|
||||
|
||||
- [actions-runner-controller/actions-runner-controller](https://github.com/actions-runner-controller/actions-runner-controller) - A Kubernetes controller for {% data variables.product.prodname_actions %} self-hosted runnners.
|
||||
- [actions-runner-controller/actions-runner-controller](https://github.com/actions-runner-controller/actions-runner-controller) - A Kubernetes controller for {% data variables.product.prodname_actions %} self-hosted runners.
|
||||
- [philips-labs/terraform-aws-github-runner](https://github.com/philips-labs/terraform-aws-github-runner) - A Terraform module for scalable {% data variables.product.prodname_actions %} runners on Amazon Web Services.
|
||||
|
||||
Each solution has certain specifics that may be important to consider:
|
||||
@@ -77,4 +77,4 @@ To authenticate using a {% data variables.product.prodname_dotcom %} App, it mu
|
||||
|
||||
You can register and delete enterprise self-hosted runners using [the API](/rest/reference/enterprise-admin#github-actions). To authenticate to the API, your autoscaling implementation can use an access token.
|
||||
|
||||
Your access token will requite the `manage_runners:enterprise` scope.
|
||||
Your access token will require the `manage_runners:enterprise` scope.
|
||||
|
||||
@@ -41,6 +41,7 @@ Contexts are a way to access information about workflow runs, runner environment
|
||||
| `strategy` | `object` | Enables access to the configured strategy parameters and information about the current job. Strategy parameters include `fail-fast`, `job-index`, `job-total`, and `max-parallel`. |
|
||||
| `matrix` | `object` | Enables access to the matrix parameters you configured for the current job. For example, if you configure a matrix build with the `os` and `node` versions, the `matrix` context object includes the `os` and `node` versions of the current job. |
|
||||
| `needs` | `object` | Enables access to the outputs of all jobs that are defined as a dependency of the current job. For more information, see [`needs` context](#needs-context). |
|
||||
| `inputs` | `object` | Enables access to the inputs of a reusable workflow. For more information, see [`inputs` context](#inputs-context). |
|
||||
|
||||
As part of an expression, you may access context information using one of two syntaxes.
|
||||
- Index syntax: `github['sha']`
|
||||
@@ -148,6 +149,17 @@ The `needs` context contains outputs from all jobs that are defined as a depende
|
||||
| `needs.<job id>.outputs.<output name>` | `string` | The value of a specific output for a job that the current job depends on. |
|
||||
| `needs.<job id>.result` | `string` | The result of a job that the current job depends on. Possible values are `success`, `failure`, `cancelled`, or `skipped`. |
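As a minimal sketch of how these properties can be used (the job names `build` and `deploy` and the output name `artifact-name` are hypothetical), a dependent job can read an upstream job's output through the `needs` context:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    # Map a step output up to a job output so that dependent jobs can read it
    outputs:
      artifact-name: ${{ steps.name.outputs.artifact-name }}
    steps:
      - id: name
        run: echo "artifact-name=site-v1" >> "$GITHUB_OUTPUT"
  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      # needs.build.outputs.artifact-name holds the value set by the build job;
      # needs.build.result would report success, failure, cancelled, or skipped
      - run: echo "Deploying ${{ needs.build.outputs.artifact-name }}"
```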
|
||||
|
||||
### `inputs` context
|
||||
|
||||
The `inputs` context contains information about the inputs of a reusable workflow. The inputs are defined in [`workflow_call` event configuration](/actions/learn-github-actions/events-that-trigger-workflows#workflow-reuse-events). These inputs are passed from [`jobs.<job_id>.with`](/actions/learn-github-actions/workflow-syntax-for-github-actions#jobsjob_idwith) in an external workflow.
|
||||
|
||||
For more information, see "[Reusing workflows](/actions/learn-github-actions/reusing-workflows)".
|
||||
|
||||
| Property name | Type | Description |
|
||||
|---------------|------|-------------|
|
||||
| `inputs` | `object` | This context is only available when it is [a reusable workflow](/actions/learn-github-actions/reusing-workflows). |
|
||||
| `inputs.<name>` | `string` or `number` or `boolean` | Each input value passed from an external workflow. |
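As a rough sketch of how these pieces fit together (the workflow path and the `config-path` input name are hypothetical), the called workflow declares its inputs under `workflow_call` and reads them through the `inputs` context, while the caller passes values with `jobs.<job_id>.with`:

```yaml
# Hypothetical reusable workflow, e.g. .github/workflows/called.yml
on:
  workflow_call:
    inputs:
      config-path:
        required: true
        type: string

jobs:
  print-input:
    runs-on: ubuntu-latest
    steps:
      # inputs.config-path resolves to whatever the caller passed under "with"
      - run: echo "Config file is ${{ inputs.config-path }}"
```

A caller would then reference this file with `uses: OWNER/REPO/.github/workflows/called.yml@main` and supply `config-path` under `with`.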
|
||||
|
||||
#### Example printing context information to the log file
|
||||
|
||||
To inspect the information that is accessible in each context, you can use this workflow file example.
|
||||
|
||||
@@ -139,7 +139,7 @@ After you enable LDAP sync, a synchronization job will run at the specified time
|
||||
A synchronization job will also run at the specified time interval to perform the following operations on each team that has been mapped to an LDAP group:
|
||||
|
||||
- If a team's corresponding LDAP group has been removed, remove all members from the team.
|
||||
- If LDAP member entries have been removed from the LDAP group, remove the corresponding users from the team. If the user loses access to any repositories as a result, delete any private forks the user has of those repositories.
|
||||
- If LDAP member entries have been removed from the LDAP group, remove the corresponding users from the team. If the user is no longer a member of any team in the organization, remove the user from the organization. If the user loses access to any repositories as a result, delete any private forks the user has of those repositories.
|
||||
- If LDAP member entries have been added to the LDAP group, add the corresponding users to the team. If the user regains access to any repositories as a result, restore any private forks of the repositories that were deleted because the user lost access in the past 90 days.
|
||||
|
||||
{% data reusables.enterprise_user_management.ldap-sync-nested-teams %}
|
||||
|
||||
@@ -22,7 +22,7 @@ Geo DNS, such as [Amazon's Route 53 service](http://docs.aws.amazon.com/Route53/
|
||||
|
||||
## Limitations
|
||||
|
||||
Writing requests to the replica requires sending the data to the primary and all replicas. This means that the performance of all writes are limited by the slowest replica, although new geo-replicas can seed the majority of their data from existing co-located geo-replicas, rather than from the primary. Geo-replication will not add capacity to a {% data variables.product.prodname_ghe_server %} instance or solve performance issues related to insufficient CPU or memory resources. If the primary appliance is offline, active replicas will be unable to serve any read or write requests.
|
||||
Writing requests to the replica requires sending the data to the primary and all replicas. This means that the performance of all writes is limited by the slowest replica, although new geo-replicas can seed the majority of their data from existing co-located geo-replicas, rather than from the primary. Geo-replication will not add capacity to a {% data variables.product.prodname_ghe_server %} instance or solve performance issues related to insufficient CPU or memory resources. If the primary appliance is offline, active replicas will be unable to serve any read or write requests.
|
||||
|
||||
{% data reusables.enterprise_installation.replica-limit %}
|
||||
|
||||
|
||||
@@ -49,7 +49,7 @@ Repository administrators can enforce required commit signing on a branch to blo
|
||||
{% data reusables.identity-and-permissions.verification-status-check %}
|
||||
|
||||
{% ifversion fpt or ghec %}
|
||||
{% data variables.product.product_name %} will automatically use GPG to sign commits you make using the {% data variables.product.product_name %} web interface, except for when you squash and merge a pull request that you are not the author of. Commits signed by {% data variables.product.product_name %} will have a verified status on {% data variables.product.product_name %}. You can verify the signature locally using the public key available at https://github.com/web-flow.gpg. The full fingerprint of the key is `5DE3 E050 9C47 EA3C F04A 42D3 4AEE 18F8 3AFD EB23`. You can optionally choose to have {% data variables.product.product_name %} sign commits you make in {% data variables.product.prodname_codespaces %}. For more information about enabling GPG verification for your codespaces, see "[Managing GPG verification for {% data variables.product.prodname_codespaces %}](/github/developing-online-with-codespaces/managing-gpg-verification-for-codespaces)."
|
||||
{% data variables.product.product_name %} will automatically use GPG to sign commits you make using the {% data variables.product.product_name %} web interface. Commits signed by {% data variables.product.product_name %} will have a verified status on {% data variables.product.product_name %}. You can verify the signature locally using the public key available at https://github.com/web-flow.gpg. The full fingerprint of the key is `5DE3 E050 9C47 EA3C F04A 42D3 4AEE 18F8 3AFD EB23`. You can optionally choose to have {% data variables.product.product_name %} sign commits you make in {% data variables.product.prodname_codespaces %}. For more information about enabling GPG verification for your codespaces, see "[Managing GPG verification for {% data variables.product.prodname_codespaces %}](/github/developing-online-with-codespaces/managing-gpg-verification-for-codespaces)."
|
||||
{% endif %}
|
||||
|
||||
## GPG commit signature verification
|
||||
|
||||
@@ -22,7 +22,7 @@ The `-K` option is in Apple's standard version of `ssh-add`, which stores the pa
|
||||
To add your SSH private key to the ssh-agent, you can specify the path to the Apple version of `ssh-add`:
|
||||
|
||||
```shell
|
||||
$ /usr/bin/ssh-add -K ~/.ssh/id_rsa
|
||||
$ /usr/bin/ssh-add -K ~/.ssh/id_ed25519
|
||||
```
|
||||
|
||||
{% note %}
|
||||
|
||||
@@ -107,7 +107,7 @@ The "Used by" section represents a single package from the repository. If you ha
|
||||
If your dependency graph is empty, there may be a problem with the file containing your dependencies. Check the file to ensure that it's correctly formatted for the file type.
|
||||
|
||||
{% ifversion fpt or ghec %}
|
||||
If the file is correctly formatted, then check its size. The dependency graph ignores individual manifest and lock files that are over 0.5 Mb, unless you are a {% data variables.product.prodname_enterprise %} user. It processes up to 20 manifest or lock files per repository by default, so you can split dependencies into smaller files in subdirectories of the repository.{% endif %}
|
||||
If the file is correctly formatted, then check its size. The dependency graph ignores individual manifest and lock files that are over 1.5 Mb, unless you are a {% data variables.product.prodname_enterprise %} user. It processes up to 20 manifest or lock files per repository by default, so you can split dependencies into smaller files in subdirectories of the repository.{% endif %}
|
||||
|
||||
If a manifest or lock file is not processed, its dependencies are omitted from the dependency graph and they can't be checked for vulnerable dependencies.
|
||||
|
||||
|
||||
@@ -32,6 +32,8 @@ When you create a codespace, a number of steps happen to create and connect you
|
||||
|
||||
For more information on what happens when you create a codespace, see "[Deep Dive](/codespaces/getting-started/deep-dive)."
|
||||
|
||||
If you want to use Git hooks for your codespace, then you should set up hooks using the [`devcontainer.json` lifecycle scripts](https://code.visualstudio.com/docs/remote/devcontainerjson-reference#_lifecycle-scripts), such as `postCreateCommand`, during step 4. Since your codespace container is created after the repository is cloned, any [git template directory](https://git-scm.com/docs/git-init#_template_directory) configured in the container image will not apply to your codespace. Hooks must instead be installed after the codespace is created. For more information on using `postCreateCommand`, see the [`devcontainer.json` reference](https://code.visualstudio.com/docs/remote/devcontainerjson-reference#_devcontainerjson-properties) in the Visual Studio Code documentation.
|
||||
|
||||
{% data reusables.codespaces.use-visual-studio-features %}
|
||||
|
||||
{% data reusables.codespaces.you-can-see-all-your-codespaces %}
|
||||
|
||||
@@ -27,6 +27,7 @@ Your codespace can be ephemeral if you need to test something or you can return
|
||||
Once you've selected the option to create a new codespace, some steps happen in the background before the codespace is available to you.
|
||||
|
||||

|
||||
|
||||
### Step 1: VM and storage are assigned to your codespace
|
||||
|
||||
When you create a codespace, a [shallow clone](https://github.blog/2020-12-21-get-up-to-speed-with-partial-clone-and-shallow-clone/) of your repository is made on a Linux virtual machine that is both dedicated and private to you. Having a dedicated VM ensures that you have the entire set of compute resources from that machine available to you. If necessary, this also allows you to have full root access to your container.
|
||||
@@ -35,13 +36,25 @@ When you create a codespace, a [shallow clone](https://github.blog/2020-12-21-ge
|
||||
|
||||
{% data variables.product.prodname_codespaces %} uses a container as the development environment. This container is created based on the configurations that you can define in a `devcontainer.json` file and/or Dockerfile in your repository. If you don't [configure a container](/codespaces/customizing-your-codespace/configuring-codespaces-for-your-project), {% data variables.product.prodname_codespaces %} uses a [default image](/codespaces/customizing-your-codespace/configuring-codespaces-for-your-project#using-the-default-configuration), which has many languages and runtimes available. For information on what the default image contains, see the [`vscode-dev-containers`](https://github.com/microsoft/vscode-dev-containers/tree/main/containers/codespaces-linux) repository.
|
||||
|
||||
{% note %}
|
||||
|
||||
**Note:** If you want to use Git hooks in your codespace and apply anything in the [git template directory](https://git-scm.com/docs/git-init#_template_directory) to your codespace, then you must set up hooks during step 4 after the container is created.
|
||||
|
||||
Since your repository is cloned onto the host VM before the container is created, anything in the [git template directory](https://git-scm.com/docs/git-init#_template_directory) will not apply in your codespace unless you set up hooks in your `devcontainer.json` configuration file using the `postCreateCommand` in step 4. For more information, see "[Step 4: Post-creation setup](#step-4-post-creation-setup)."
|
||||
|
||||
{% endnote %}
|
||||
|
||||
### Step 3: Connecting to the codespace
|
||||
|
||||
When your container has been created and any other initialization has run, you'll be connected to your codespace. You can connect to it through the web or via [Visual Studio Code](/codespaces/developing-in-codespaces/using-codespaces-in-visual-studio-code), or both, if needed.
|
||||
|
||||
### Step 4: Post-creation setup
|
||||
|
||||
Once you're connected to your codespace, automated setup that you specified in your `devcontainer.json` file, such as running the `postCreateCommand` and `postAttachCommand`, may continue. If you have a public dotfiles repository {% data variables.product.prodname_codespaces %}, you can enable it for use with new codespaces. When enabled, your dotfiles will be cloned to the container and look for an install file. For more information, see "[Personalizing {% data variables.product.prodname_codespaces %} for your account](/github/developing-online-with-codespaces/personalizing-codespaces-for-your-account#dotfiles)."
|
||||
Once you are connected to your codespace, your automated setup may continue to build based on the configuration you specified in your `devcontainer.json` file. You may see `postCreateCommand` and `postAttachCommand` run.
|
||||
|
||||
If you want to use Git hooks in your codespace, set up hooks using the [`devcontainer.json` lifecycle scripts](https://code.visualstudio.com/docs/remote/devcontainerjson-reference#_lifecycle-scripts), such as `postCreateCommand`. For more information, see the [`devcontainer.json` reference](https://code.visualstudio.com/docs/remote/devcontainerjson-reference#_devcontainerjson-properties) in the Visual Studio Code documentation.
|
||||
|
||||
If you have a public dotfiles repository for {% data variables.product.prodname_codespaces %}, you can enable it for use with new codespaces. When enabled, your dotfiles will be cloned to the container and the install script will be invoked. For more information, see "[Personalizing {% data variables.product.prodname_codespaces %} for your account](/github/developing-online-with-codespaces/personalizing-codespaces-for-your-account#dotfiles)."
|
||||
|
||||
Finally, the entire history of the repository is copied down with a full clone.
|
||||
|
||||
|
||||
@@ -156,7 +156,7 @@ For more information on enforcing two-factor authentication and allowed IP addre
|
||||
You can centrally manage access to your enterprise's resources, organization membership, and team membership using your IdP and SAML single sign-on (SSO). Enterprise owners can enable SAML SSO across all organizations owned by an enterprise account. For more information, see "[About identity and access management for your enterprise](/enterprise-cloud@latest/admin/authentication/managing-identity-and-access-for-your-enterprise/about-identity-and-access-management-for-your-enterprise)."
|
||||
|
||||
#### 3. Managing team synchronization
|
||||
You can enable and manage team sychronization between an identity provider (IdP) and {% data variables.product.prodname_dotcom %} to allow organizations owned by your enterprise account to manage team membership with IdP groups. For more information, see "[Managing team synchronization for organizations in your enterprise account](/enterprise-cloud@latest/admin/authentication/managing-identity-and-access-for-your-enterprise/managing-team-synchronization-for-organizations-in-your-enterprise)."
|
||||
You can enable and manage team synchronization between an identity provider (IdP) and {% data variables.product.prodname_dotcom %} to allow organizations owned by your enterprise account to manage team membership with IdP groups. For more information, see "[Managing team synchronization for organizations in your enterprise account](/enterprise-cloud@latest/admin/authentication/managing-identity-and-access-for-your-enterprise/managing-team-synchronization-for-organizations-in-your-enterprise)."
|
||||
|
||||
#### 4. Enforcing policies for Advanced Security features in your enterprise account
|
||||
{% data reusables.getting-started.enterprise-advanced-security %}
|
||||
|
||||
@@ -16,36 +16,44 @@ topics:
|
||||
shortTitle: SSH certificate authorities
|
||||
---
|
||||
|
||||
An SSH certificate is a mechanism for one SSH key to sign another SSH key. If you use an SSH certificate authority (CA) to provide your organization members with signed SSH certificates, you can add the CA to your enterprise account or organization to allow organization members to use their certificates to access organization resources. For more information, see "[Managing your organization's SSH certificate authorities](/articles/managing-your-organizations-ssh-certificate-authorities)."
|
||||
## About SSH certificate authorities
|
||||
|
||||
After you add an SSH CA to your organization or enterprise account, you can use the CA to sign client SSH certificates for organization members. Organization members can use the signed certificates to access your organization's repositories (and only your organization's repositories) with Git. You can require that members use SSH certificates to access organization resources. For more information, see "[Enforcing policies for security settings in your enterprise](/admin/policies/enforcing-policies-for-your-enterprise/enforcing-policies-for-security-settings-in-your-enterprise#managing-ssh-certificate-authorities-for-your-enterprise)."
|
||||
An SSH certificate is a mechanism for one SSH key to sign another SSH key. If you use an SSH certificate authority (CA) to provide your organization members with signed SSH certificates, you can add the CA to your enterprise account or organization to allow organization members to use their certificates to access organization resources.
|
||||
|
||||
After you add an SSH CA to your organization or enterprise account, you can use the CA to sign client SSH certificates for organization members. Organization members can use the signed certificates to access your organization's repositories (and only your organization's repositories) with Git. Optionally, you can require that members use SSH certificates to access organization resources. For more information, see "[Managing your organization's SSH certificate authorities](/articles/managing-your-organizations-ssh-certificate-authorities)" and "[Enforcing policies for security settings in your enterprise](/admin/policies/enforcing-policies-for-your-enterprise/enforcing-policies-for-security-settings-in-your-enterprise#managing-ssh-certificate-authorities-for-your-enterprise)."
|
||||
|
||||
For example, you can build an internal system that issues a new certificate to your developers every morning. Each developer can use their daily certificate to work on your organization's repositories on {% data variables.product.product_name %}. At the end of the day, the certificate can automatically expire, protecting your repositories if the certificate is later compromised.
|
||||
|
||||
{% ifversion fpt or ghec %}
|
||||
Organization members can use their signed certificates for authentication even if you've enforced SAML single sign-on. Unless you make SSH certificates a requirement, organization members can continue to use other means of authentication to access your organization's resources with Git, including their username and password, personal access tokens, and their own SSH keys.
|
||||
{% endif %}
|
||||
|
||||
Members will not be able to use their certificates to access forks of your repositories that are owned by their user accounts.
|
||||
|
||||
To prevent authentication errors, organization members should use a special URL that includes the organization ID to clone repositories using signed certificates. Anyone with read access to the repository can find this URL on the repository page. For more information, see "[Cloning a repository](/articles/cloning-a-repository)."
|
||||
|
||||
## Issuing certificates
|
||||
|
||||
When you issue each certificate, you must include an extension that specifies which {% data variables.product.product_name %} user the certificate is for. For example, you can use OpenSSH's `ssh-keygen` command, replacing _KEY-IDENTITY_ with your key identity and _USERNAME_ with a {% data variables.product.product_name %} username. The certificate you generate will be authorized to act on behalf of that user for any of your organization's resources. Make sure you validate the user's identity before you issue the certificate.
|
||||
|
||||
```shell
|
||||
$ ssh-keygen -s ./ca-key -I <em>KEY-IDENTITY</em> -O extension:login@{% data variables.product.product_url %}=<em>USERNAME</em> ./user-key.pub
|
||||
$ ssh-keygen -s ./ca-key -V '+1d' -I <em>KEY-IDENTITY</em> -O extension:login@{% data variables.product.product_url %}=<em>USERNAME</em> ./user-key.pub
|
||||
```
|
||||
|
||||
{% warning %}
|
||||
|
||||
**Warning**: After a certificate has been signed and issued, the certificate cannot be revoked. Make sure to use the `-V` flag to configure a lifetime for the certificate, or the certificate can be used indefinitely.
|
||||
|
||||
{% endwarning %}
|
||||
|
||||
To issue a certificate for someone who uses SSH to access multiple {% data variables.product.company_short %} products, you can include two login extensions to specify the username for each product. For example, the following command would issue a certificate for _USERNAME-1_ for the user's account for {% data variables.product.prodname_ghe_cloud %}, and _USERNAME-2_ for the user's account on {% data variables.product.prodname_ghe_managed %} or {% data variables.product.prodname_ghe_server %} at _HOSTNAME_.
|
||||
|
||||
```shell
|
||||
$ ssh-keygen -s ./ca-key -I <em>KEY-IDENTITY</em> -O extension:login@github.com=<em>USERNAME-1</em> extension:login@<em>HOSTNAME</em>=<em>USERNAME-2</em> ./user-key.pub
|
||||
$ ssh-keygen -s ./ca-key -V '+1d' -I <em>KEY-IDENTITY</em> -O extension:login@github.com=<em>USERNAME-1</em> extension:login@<em>HOSTNAME</em>=<em>USERNAME-2</em> ./user-key.pub
|
||||
```
|
||||
|
||||
You can restrict the IP addresses from which an organization member can access your organization's resources by using a `source-address` extension. The extension accepts a specific IP address or a range of IP addresses using CIDR notation. You can specify multiple addresses or ranges by separating the values with commas. For more information, see "[Classless Inter-Domain Routing](https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing#CIDR_notation)" on Wikipedia.
|
||||
|
||||
```shell
|
||||
$ ssh-keygen -s ./ca-key -I <em>KEY-IDENTITY</em> -O extension:login@{% data variables.product.product_url %}=<em>USERNAME</em> -O source-address=<em>COMMA-SEPARATED-LIST-OF-IP-ADDRESSES-OR-RANGES</em> ./user-key.pub
|
||||
$ ssh-keygen -s ./ca-key -V '+1d' -I <em>KEY-IDENTITY</em> -O extension:login@{% data variables.product.product_url %}=<em>USERNAME</em> -O source-address=<em>COMMA-SEPARATED-LIST-OF-IP-ADDRESSES-OR-RANGES</em> ./user-key.pub
|
||||
```
|
||||
|
||||
{% ifversion fpt or ghec %}
|
||||
|
||||
Organization members can use their signed certificates for authentication even if you've enforced SAML single sign-on. Unless you make SSH certificates a requirement, organization members can continue to use other means of authentication to access your organization's resources with Git, including their username and password, personal access tokens, and their own SSH keys.
|
||||
|
||||
{% endif %}
|
||||
|
||||
To prevent authentication errors, organization members should use a special URL that includes the organization ID to clone repositories using signed certificates. Anyone with read access to the repository can find this URL on the repository page. For more information, see "[Cloning a repository](/articles/cloning-a-repository)."
|
||||
|
||||
@@ -103,7 +103,7 @@ Some of the features listed below are limited to organizations using {% data var
|
||||
| Disable team discussions for an organization (see "[Disabling team discussions for your organization](/articles/disabling-team-discussions-for-your-organization)" for details) | **X** | | | |
|
||||
| Manage viewing of organization dependency insights (see "[Changing the visibility of your organization's dependency insights](/articles/changing-the-visibility-of-your-organizations-dependency-insights)" for details) | **X** | | | |
|
||||
| Set a team profile picture in **all teams** (see "[Setting your team's profile picture](/articles/setting-your-team-s-profile-picture)" for details) | **X** | | | |
|
||||
| Sponsor accounts and manage the organization's sponsorships (see "[Sponsoring open-source contributors](/sponsors/sponsoring-open-source-contributors)" for details) | **X** | **X** | | **X** |
|
||||
| Sponsor accounts and manage the organization's sponsorships (see "[Sponsoring open-source contributors](/sponsors/sponsoring-open-source-contributors)" for details) | **X** | | **X** | **X** |
|
||||
| Manage email updates from sponsored accounts (see "[Managing updates from accounts your organization's sponsors](/organizations/managing-organization-settings/managing-updates-from-accounts-your-organization-sponsors)" for details) | **X** | | | |
|
||||
| Attribute your sponsorships to another organization (see "[Attributing sponsorships to your organization](/sponsors/sponsoring-open-source-contributors/attributing-sponsorships-to-your-organization)" for details ) | **X** | | | |
|
||||
| Manage the publication of {% data variables.product.prodname_pages %} sites from repositories in the organization (see "[Managing the publication of {% data variables.product.prodname_pages %} sites for your organization](/organizations/managing-organization-settings/managing-the-publication-of-github-pages-sites-for-your-organization)" for details) | **X** | | | |
|
||||
|
||||
@@ -20,14 +20,14 @@ shortTitle: Delete & restore a package
|
||||
|
||||
On {% data variables.product.prodname_dotcom %} if you have the required access, you can delete:
|
||||
- an entire private package
|
||||
- an entire public package, if there's not more than 25 downloads of any version of the package
|
||||
- an entire public package, if there's not more than 5000 downloads of any version of the package
|
||||
- a specific version of a private package
|
||||
- a specific version of a public package, if the package version doesn't have more than 25 downloads
|
||||
- a specific version of a public package, if the package version doesn't have more than 5000 downloads
|
||||
|
||||
{% note %}
|
||||
|
||||
**Note:**
|
||||
- You cannot delete a public package if any version of the package has more than 25 downloads. In this scenario, contact [GitHub support](https://support.github.com/contact?tags=docs-packages) for further assistance.
|
||||
- You cannot delete a public package if any version of the package has more than 5000 downloads. In this scenario, contact [GitHub support](https://support.github.com/contact?tags=docs-packages) for further assistance.
|
||||
- When deleting public packages, be aware that you may break projects that depend on your package.
|
||||
|
||||
{% endnote %}
|
||||
|
||||
@@ -55,7 +55,7 @@ For a list of supported custom domains, see "[About custom domains and {% data v
|
||||
|
||||
## HTTPS errors
|
||||
|
||||
{% data variables.product.prodname_pages %} sites using custom domains that are correctly configured with _CNAME_, `ALIAS`, `ANAME`, or `A` DNS records can be accessed over HTTPS. For more information, see "[Securing your {% data variables.product.prodname_pages %} site with HTTPS](/articles/securing-your-github-pages-site-with-https)."
|
||||
{% data variables.product.prodname_pages %} sites using custom domains that are correctly configured with `CNAME`, `ALIAS`, `ANAME`, or `A` DNS records can be accessed over HTTPS. For more information, see "[Securing your {% data variables.product.prodname_pages %} site with HTTPS](/articles/securing-your-github-pages-site-with-https)."
|
||||
|
||||
It can take up to an hour for your site to become available over HTTPS after you configure your custom domain. After you update existing DNS settings, you may need to remove and re-add your custom domain to your site's repository to trigger the process of enabling HTTPS. For more information, see "[Managing a custom domain for your {% data variables.product.prodname_pages %} site](/articles/managing-a-custom-domain-for-your-github-pages-site)."
|
||||
|
||||
|
||||
@@ -51,9 +51,9 @@ When you create a branch rule, the branch you specify doesn't have to exist yet
|
||||
{% data reusables.repositories.repository-branches %}
|
||||
{% data reusables.repositories.add-branch-protection-rules %}
|
||||
1. Optionally, enable required pull request reviews.
|
||||
- Under "Protect matching branches", select **Require pull request reviews before merging**.
|
||||
- Under "Protect matching branches", select **Require a pull request before merging**.
|
||||

|
||||
- Click the **Required approving reviews** drop-down menu, then select the number of approving reviews you'd like to require on the branch.
|
||||
- Click the **Required approving reviews** drop-down menu, then select the number of approving reviews you'd like to require on the branch.
|
||||

|
||||
- Optionally, to dismiss a pull request approval review when a code-modifying commit is pushed to the branch, select **Dismiss stale pull request approvals when new commits are pushed**.
|
||||

|
||||
|
||||
@@ -13,7 +13,7 @@ topics:
|
||||
---
|
||||
## About CITATION files
|
||||
|
||||
You can add a `CITATION.cff` file to the root of a repository to let others know how you would like them to cite your work. The citation file format is plain text with human- and machine-readable citation information.
|
||||
You can add a `CITATION.cff` file to the root of a repository to let others know how you would like them to cite your work. The citation file format is plain text with human- and machine-readable citation information.
|
||||
|
||||
Example `CITATION.cff` file:
|
||||
|
||||
@@ -34,7 +34,7 @@ date-released: 2017-12-18
|
||||
url: "https://github.com/github/linguist"
|
||||
```
|
||||
|
||||
The GitHub citation prompt on your repository will show the example `CITATION.cff` content in these formats:
|
||||
The GitHub citation prompt on your repository will show the example `CITATION.cff` content in these formats:
|
||||
|
||||
**APA**
|
||||
|
||||
@@ -58,7 +58,7 @@ Lisa, M., & Bot, H. (2017). My Research Software (Version 2.0.4) [Computer softw
|
||||
```
|
||||
{% endraw %}
|
||||
|
||||
Note the example above produces a _software_ citation (i.e., `@software` type in BibTeX rather than `@article`).
|
||||
Note the example above produces a _software_ citation (i.e., `@software` type in BibTeX rather than `@article`).
|
||||
|
||||
For more information, see the [Citation File Format](https://citation-file-format.github.io/) website.
|
||||
|
||||
@@ -70,11 +70,21 @@ When you add a `CITATION.cff` file to the default branch of your repository, it
|
||||
|
||||
If you would prefer the {% data variables.product.prodname_dotcom %} citation information to link to another resource such as a research article, then you can use the `preferred-citation` override in CFF with the following types.
|
||||
|
||||
Resource | Type
|
||||
--------- | -----
|
||||
Research article | `article`
|
||||
Conference paper | `conference-paper`
|
||||
Book | `book`
|
||||
| Resource | CFF type | BibTeX type | APA annotation |
|
||||
|----------|----------|-------------|----------------|
|
||||
| Journal article/paper | `article` | `@article` | |
|
||||
| Book | `book` | `@book` | |
|
||||
| Booklet (bound but not published) | `pamphlet` | `@booklet` | |
|
||||
| Conference article/paper | `conference-paper` | `@inproceedings` | [Conference paper] |
|
||||
| Conference proceedings | `conference`, `proceedings` | `@proceedings` | |
|
||||
| Data set | `data`, `database` | `@misc` | [Data set] |
|
||||
| Magazine article | `magazine-article` | `@article` | |
|
||||
| Manual | `manual` | `@manual` | |
|
||||
| Misc/generic/other | `generic`, any other CFF type | `@misc` | |
|
||||
| Newspaper article | `newspaper-article` | `@article` | |
|
||||
| Software | `software`, `software-code`, `software-container`, `software-executable`, `software-virtual-machine` | `@software` | [Computer software] |
|
||||
| Report/technical report | `report` | `@techreport` | |
|
||||
| Unpublished | `unpublished` | `@unpublished` | |
|
||||
|
||||
Extended CITATION.cff file describing the software, but linking to a research article as the preferred citation:
|
||||
|
||||
@@ -113,7 +123,7 @@ preferred-citation:
|
||||
year: 2021
|
||||
```
|
||||
|
||||
The example `CITATION.cff` file above will produce the following outputs in the GitHub citation prompt:
|
||||
The example `CITATION.cff` file above will produce the following outputs in the GitHub citation prompt:
|
||||
|
||||
**APA**
|
||||
|
||||
@@ -141,11 +151,11 @@ Lisa, M., & Bot, H. (2021). My awesome research software. Journal Title, 1(1), 1
|
||||
|
||||
## Citing a dataset
|
||||
|
||||
If your repository contains a dataset, you can set `type: dataset` in your `CITATION.cff` file to produce a data citation string output in the {% data variables.product.prodname_dotcom %} citation prompt.
|
||||
If your repository contains a dataset, you can set `type: dataset` at the top level of your `CITATION.cff` file to produce a data citation string output in the {% data variables.product.prodname_dotcom %} citation prompt.
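A minimal sketch of such a file, reusing the placeholder author names from the earlier examples, might look like this (the title and release date are placeholders):

```yaml
cff-version: 1.2.0
type: dataset
title: "My Research Dataset"
message: "If you use this dataset, please cite it as below."
authors:
  - family-names: "Lisa"
    given-names: "Mona"
  - family-names: "Bot"
    given-names: "Hello"
date-released: 2021-08-11
```

The top-level `type: dataset` is what switches the citation prompt to a data citation string.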
|
||||
|
||||
## Other citation files
|
||||
|
||||
The GitHub citation feature will also detect a small number of additional files that are often used by communities and projects to describe how they would like their work to be cited.
|
||||
The GitHub citation feature will also detect a small number of additional files that are often used by communities and projects to describe how they would like their work to be cited.
|
||||
|
||||
GitHub will link to these files in the _Cite this repository_ prompt, but will not attempt to parse them into other citation formats.
|
||||
|
||||
@@ -158,7 +168,7 @@ CITATIONS.bib
|
||||
CITATION.md
|
||||
CITATIONS.md
|
||||
|
||||
# CITATION files for R packages are typically found at inst/CITATION
|
||||
# CITATION files for R packages are typically found at inst/CITATION
|
||||
inst/CITATION
|
||||
```
|
||||
|
||||
|
||||
@@ -42,7 +42,7 @@ For code owners to receive review requests, the CODEOWNERS file must be on the b
|
||||
{% ifversion fpt or ghae or ghes > 3.2 or ghec %}
|
||||
## CODEOWNERS file size
|
||||
|
||||
CODEOWNERS files must be under 3 MB in size. A CODEOWNERS file over this limit will not be loaded, which means that code owner information not to be shown and the appropriate code owners will not be requested to review changes in a pull request.
|
||||
CODEOWNERS files must be under 3 MB in size. A CODEOWNERS file over this limit will not be loaded, which means that code owner information is not shown and the appropriate code owners will not be requested to review changes in a pull request.
|
||||
|
||||
To reduce the size of your CODEOWNERS file, consider using wildcard patterns to consolidate multiple entries into a single entry.
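For example (the paths and team name here are hypothetical), several per-file entries can collapse into a single wildcard entry:

```
# Instead of listing every file under /docs individually...
# /docs/getting-started.md  @octo-org/docs-team
# /docs/configuration.md    @octo-org/docs-team

# ...one pattern assigns the same owners to everything in the directory
/docs/ @octo-org/docs-team
```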
|
||||
{% endif %}
|
||||
|
||||
@@ -204,6 +204,12 @@ To link to the same article in a different version, use this format:
|
||||
|
||||
To link to a specific version, you must include the version in the path (e.g., `/enterprise-cloud@latest/admin/overview/about-enterprise-accounts`).
|
||||
|
||||
### Links to learning paths
|
||||
|
||||
Use this format to link to a learning path.
|
||||
|
||||
> For more information, follow the "[LEARNING PATH TITLE]()" learning path.
|
||||
|
||||
### Links to external resources
|
||||
|
||||
When linking to an external site, choose the most useful resource for the context of the link - you can link to a whole site if it's a general reference or to a specific page if that would be more helpful.
|
||||
|
||||
@@ -8,7 +8,7 @@ This site is powered by Node.js! :sparkles: :turtle: :rocket: :sparkles:
|
||||
|
||||
It runs on macOS, Windows, and Linux environments.
|
||||
|
||||
You'll need Node.js version 16 to run the site. To install Node.js, [download the "LTS" installer from nodejs.org](https://nodejs.org). If you're using [`nodenv`](https://github.com/nodenv/nodenv), read the [`nodenv` docs](#nodenv) for instructions on switching Node.js versions.
|
||||
You'll need Node.js version 16 to run the site. To install Node.js, [download the "LTS" installer from nodejs.org](https://nodejs.org). If you're using [`nodenv`](https://github.com/nodenv/nodenv), read the [`nodenv` docs](https://github.com/nodenv/nodenv#readme) for instructions on switching Node.js versions.
|
||||
|
||||
You'll want to [install Git LFS](https://docs.github.com/en/github/managing-large-files/versioning-large-files/installing-git-large-file-storage).
|
||||
|
||||
@@ -24,7 +24,7 @@ npm start
|
||||
|
||||
You should now have a running server! Visit [localhost:4000](http://localhost:4000) in your browser. It will automatically restart as you make changes to site content.
|
||||
|
||||
When you're ready to stop your local server, type <kbd>CTRL</kbd><kbd>c</kbd> in your terminal window.
|
||||
When you're ready to stop your local server, type <kbd>Ctrl</kbd>+<kbd>C</kbd> in your terminal window.
|
||||
|
||||
Note that `npm ci` and `npm run build` are steps that should typically only need to be run once each time you pull the latest for a branch.
|
||||
- `npm ci` does a clean install of dependencies, without updating the `package-lock.json` file
|
||||
@@ -42,7 +42,7 @@ This repo has configuration for debugging with VS Code's built-in Node Debugger.
|
||||
|
||||
1. After running the build steps, start the app by running `npm run debug`.
|
||||
1. In VS Code, click on the Debugging icon in the Activity Bar to bring up the Debug view.
|
||||
1. In the Debug View, select the **'Node: Nodemon'** configuration, then press F5 or click the green play button. You should see all of your running node processes.
|
||||
1. In the Debug View, select the **'Node: Nodemon'** configuration, then press <kbd>F5</kbd> or click the green play button. You should see all of your running node processes.
|
||||
1. Select the node process that's started with the `--inspect` flag.
|
||||
1. Debugger has now been attached. Enjoy!
|
||||
|
||||
@@ -58,11 +58,11 @@ At the `/dev-toc` path, you'll see a list of available versions. Click a version
|
||||
|
||||
By default the local server won't run with all supported languages enabled. If you need to run the server with a particular language, you can temporarily edit the `start` script in `package.json` and update the `ENABLED_LANGUAGES` variable. For example, to enable Japanese and Portuguese, you can set it to `ENABLED_LANGUAGES='en,ja,pt'` and then you need to restart the server for the change to take effect.
|
||||
|
||||
The supported language codes are defined in [lib/lanuages.js](../lib/languages.js).
|
||||
The supported language codes are defined in [lib/languages.js](../lib/languages.js).
|
||||
|
||||
## Site structure
|
||||
|
||||
This site was originally a Ruby on Rails web application. Some time later it was converted into a static site powered by [Jekyll](https://jekyllrb.com/). A few years after that it was migrated to [Nanoc](https://nanoc.ws/), another Ruby static site generator.
|
||||
This site was originally a Ruby on Rails web application. Some time later it was converted into a static site powered by [Jekyll](https://jekyllrb.com/). A few years after that it was migrated to [Nanoc](https://nanoc.app/), another Ruby static site generator.
|
||||
|
||||
Today it's a dynamic Node.js webserver powered by Express, using [middleware](../middleware/README.md) to support proper HTTP redirects, language header detection, and dynamic content generation to support the various flavors of GitHub's product documentation, like GitHub.com and GitHub Enterprise Server.
|
||||
|
||||
|
||||
@@ -1,15 +1,15 @@
|
||||
# Localization Prep Checklist
|
||||
|
||||
Use the following checklist to help make your files more translation-friendly. For additional information, refer to the [style guide](content-style-guide.md).
|
||||
Use the following checklist to help make your files more translation-friendly. For additional information, refer to the [style guide](content-style-guide.md). It gives more detail about how to write content that abides by the localization rules.
|
||||
|
||||
- [ ] Use examples that are generic and can be understood by most people.
|
||||
- [ ] Avoid controversial examples or culturally specific to a group.
|
||||
- [ ] Write in active voice.
|
||||
- [ ] Write simple, short and easy to understand sentences.
|
||||
- [ ] Avoid using too many pronouns that can make text unclear.
|
||||
- [ ] Avoid slang and jokes.
|
||||
- [ ] Avoid using slang and jokes that might be related to any person, place, or gender.
|
||||
- [ ] Avoid negative sentences.
|
||||
- [ ] Use industry standard acronyms whenever possible and explain custom acronyms.
|
||||
- [ ] Use industry standard acronyms wherever possible and explain custom acronyms.
|
||||
- [ ] Use indicative mood.
|
||||
- [ ] Eliminate redundant and wordy expressions.
|
||||
- [ ] Avoid the excessive use of stacked modifiers (noun strings). The translator can misunderstand which one is the noun being modified.
|
||||
@@ -18,6 +18,7 @@ Use the following checklist to help make your files more translation-friendly. F
|
||||
- [ ] Avoid using ambiguous modal auxiliary verbs.
|
||||
- [ ] Avoid gender-specific words.
|
||||
- [ ] Avoid prepositional phrases.
|
||||
- [ ] Avoid using passive voice to refer to the relevant topic.
|
||||
- [ ] Avoid vague nouns and pronouns (vague sentence subject).
|
||||
- [ ] Keep inline links to a minimum. If they are necessary, preface them with a phrase such as "For more information, see "Link title". Alternatively, add relevant links to a "Further reading" section at the end of the topic.
|
||||
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
When you select the **Rebase and merge** option on a pull request on {% data variables.product.product_location %}, all commits from the topic branch (or head branch) are added onto the base branch individually without a merge commit. Pull requests with rebased commits are merged using the [fast-forward option](https://git-scm.com/docs/git-merge#_fast_forward_merge).
|
||||
When you select the **Rebase and merge** option on a pull request on {% data variables.product.product_location %}, all commits from the topic branch (or head branch) are added onto the base branch individually without a merge commit. In this way, the rebase and merge behavior resembles a [fast-forward merge](https://git-scm.com/docs/git-merge#_fast_forward_merge) by maintaining a linear project history. However, rebasing achieves this by rewriting the commit history on the base branch with new commits.
|
||||
|
||||
The rebase and merge behavior on {% data variables.product.product_name %} deviates slightly from `git rebase`. Rebase and merge on {% data variables.product.prodname_dotcom %} will always update the committer information and create new commit SHAs, whereas `git rebase` outside of {% data variables.product.prodname_dotcom %} does not change the committer information when the rebase happens on top of an ancestor commit. For more information about `git rebase`, see [git-rebase](https://git-scm.com/docs/git-rebase) in the Git documentation.
|
||||
|
||||
To rebase and merge pull requests, you must have [write permissions](/articles/repository-permission-levels-for-an-organization/) in the repository, and the repository must [allow rebase merging](/articles/configuring-commit-rebasing-for-pull-requests/).
|
||||
|
||||
The rebase and merge behavior on {% data variables.product.product_name %} deviates slightly from `git rebase`. Rebase and merge on {% data variables.product.prodname_dotcom %} will always update the committer information and create new commit SHAs, whereas `git rebase` outside of {% data variables.product.prodname_dotcom %} does not change the committer information when the rebase happens on top of an ancestor commit. For more information about `git rebase`, see [the official Git documentation](https://git-scm.com/docs/git-rebase).
|
||||
|
||||
For a visual representation of `git rebase`, see [The "Git Branching - Rebasing" chapter from the _Pro Git_ book](https://git-scm.com/book/en/Git-Branching-Rebasing).
|
||||
|
||||
@@ -1,2 +1,2 @@
|
||||
1. In your user settings sidebar, click **Blocked users**.
|
||||
1. In your user settings sidebar, click **Blocked users** under **Moderation settings**.
|
||||

|
||||
|
||||
@@ -3,6 +3,8 @@ import fetch from 'node-fetch'
|
||||
import statsd from '../lib/statsd.js'
|
||||
import FailBot from '../lib/failbot.js'
|
||||
|
||||
const TIME_OUT_TEXT = 'ms has passed since batch creation'
|
||||
|
||||
export default class Hydro {
|
||||
constructor({ secret, endpoint } = {}) {
|
||||
this.secret = secret || process.env.HYDRO_SECRET
|
||||
@@ -67,6 +69,9 @@ export default class Hydro {
|
||||
|
||||
const failures = await res.text()
|
||||
|
||||
// If Hydro just took too long, ignore it
|
||||
if (failures.includes(TIME_OUT_TEXT)) throw new Error(`Hydro timed out (${failures})`)
|
||||
|
||||
FailBot.report(err, {
|
||||
hydroStatus: res.status,
|
||||
hydroText: res.statusText,
|
||||
|
||||
87
lib/liquid-tags/tokens.js
Normal file
87
lib/liquid-tags/tokens.js
Normal file
@@ -0,0 +1,87 @@
|
||||
import walk from 'walk-sync'
|
||||
import { Tokenizer } from 'liquidjs'
|
||||
import { readFileSync } from 'fs'
|
||||
import gitDiff from 'git-diff'
|
||||
import _ from 'lodash'
|
||||
|
||||
function getGitDiff(a, b) {
|
||||
return gitDiff(a, b, { flags: '--ignore-all-space' })
|
||||
}
|
||||
|
||||
function getMissingLines(diff) {
|
||||
return diff
|
||||
.split('\n')
|
||||
.filter((line) => line.startsWith('-'))
|
||||
.map((line) => line.replace('-', ''))
|
||||
}
|
||||
|
||||
function getExceedingLines(diff) {
|
||||
return diff
|
||||
.split('\n')
|
||||
.filter((line) => line.startsWith('+'))
|
||||
.map((line) => line.replace('+', ''))
|
||||
}
|
||||
|
||||
export function languageFiles(language, folder = 'content') {
|
||||
const englishFiles = walk(folder, { directories: false })
|
||||
const languageFiles = walk(`${language.dir}/${folder}`, { directories: false })
|
||||
return _.intersection(englishFiles, languageFiles).map((file) => `${folder}/${file}`)
|
||||
}
|
||||
|
||||
export function compareLiquidTags(file, language) {
|
||||
const translation = `${language.dir}/${file}`
|
||||
const sourceTokens = getTokensFromFile(file).rejectType('html')
|
||||
const otherFileTokens = getTokensFromFile(translation).rejectType('html')
|
||||
const diff = sourceTokens.diff(otherFileTokens)
|
||||
|
||||
return {
|
||||
file,
|
||||
translation,
|
||||
diff,
|
||||
}
|
||||
}
|
||||
|
||||
function getTokens(contents) {
|
||||
const tokenizer = new Tokenizer(contents)
|
||||
return new Tokens(...tokenizer.readTopLevelTokens())
|
||||
}
|
||||
|
||||
export function getTokensFromFile(filePath) {
|
||||
const contents = readFileSync(filePath, 'utf8')
|
||||
try {
|
||||
return new Tokens(...getTokens(contents))
|
||||
} catch (e) {
|
||||
const error = new Error(`Error parsing ${filePath}: ${e.message}`)
|
||||
error.filePath = filePath
|
||||
throw error
|
||||
}
|
||||
}
|
||||
|
||||
export class Tokens extends Array {
|
||||
rejectType(tagType) {
|
||||
return this.filter(
|
||||
(token) => token.constructor.name.toUpperCase() !== `${tagType}Token`.toUpperCase()
|
||||
)
|
||||
}
|
||||
|
||||
onlyText() {
|
||||
return this.map((token) => token.getText())
|
||||
}
|
||||
|
||||
diff(otherTokens) {
|
||||
const a = this.onlyText()
|
||||
const b = otherTokens.onlyText()
|
||||
|
||||
const diff = getGitDiff(a.join('\n'), b.join('\n'))
|
||||
|
||||
if (!diff) {
|
||||
return { count: 0, missing: [], exceeding: [], output: '' }
|
||||
}
|
||||
|
||||
const missing = getMissingLines(diff)
|
||||
const exceeding = getExceedingLines(diff)
|
||||
const count = exceeding.length + missing.length
|
||||
|
||||
return { count, missing, exceeding, output: diff }
|
||||
}
|
||||
}
|
||||
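For example, a caller could pair `languageFiles` and `compareLiquidTags` from the new file above to report Liquid tag drift across a translation. This is a sketch only; the `language` object below is hypothetical and only needs the `dir` property that the helpers rely on.

```js
import { languageFiles, compareLiquidTags } from './lib/liquid-tags/tokens.js'

// Hypothetical language descriptor — only `dir` is read by the helpers above.
const japanese = { code: 'ja', dir: 'translations/ja-JP' }

for (const file of languageFiles(japanese, 'content')) {
  const { translation, diff } = compareLiquidTags(file, japanese)
  if (diff.count > 0) {
    console.log(`${translation}: ${diff.count} Liquid tag difference(s)`)
    console.log(diff.output)
  }
}
```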
@@ -48658,7 +48658,7 @@
|
||||
}
|
||||
],
|
||||
"summary": "Delete a code scanning analysis from a repository",
|
||||
"description": "Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the `repo` scope. For public repositories,\nyou must use an access token with `public_repo` and `repo:security_events` scopes.\nGitHub Apps must have the `security_events` write permission to use this endpoint.\n\nYou can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.\n\nWhen you list the analyses for a repository,\none or more will be identified as deletable in the response:\n\n```\n\"deletable\": true\n```\n\nAn analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:\n\n* `ref`\n* `tool`\n* `analysis_key`\n* `environment`\n\nIf you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:\n\n```\nAnalysis specified is not deletable.\n```\n\nThe response from a successful `DELETE` operation provides you with\ntwo alternative URLs for deleting the next analysis in the set\n(see the example default response below).\nUse the `next_analysis_url` URL if you want to avoid accidentally deleting the final analysis\nin the set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the `confirm_delete_url` URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set the value of `next_analysis_url` and `confirm_delete_url`\nin the 200 response is `null`.\n\nAs an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find the deletable analysis for one of the sets,\nstep through deleting the analyses in that set,\nand then repeat the process for the second set.\nThe procedure therefore consists of a nested loop:\n\n**Outer loop**:\n* List the analyses for the repository, filtered by tool.\n* Parse this list to find a deletable analysis. If found:\n\n **Inner loop**:\n * Delete the identified analysis.\n * Parse the response for the value of `confirm_delete_url` and, if found, use this in the next iteration.\n\nThe above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the `confirm_delete_url` value. Alternatively, you could use the `next_analysis_url` value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.",
|
||||
"description": "Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the `repo` scope. For public repositories,\nyou must use an access token with `public_repo` and `repo:security_events` scopes.\nGitHub Apps must have the `security_events` write permission to use this endpoint.\n\nYou can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.\n\nWhen you list the analyses for a repository,\none or more will be identified as deletable in the response:\n\n```\n\"deletable\": true\n```\n\nAn analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:\n\n* `ref`\n* `tool`\n* `analysis_key`\n* `environment`\n\nIf you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:\n\n```\nAnalysis specified is not deletable.\n```\n\nThe response from a successful `DELETE` operation provides you with\ntwo alternative URLs for deleting the next analysis in the set:\n`next_analysis_url` and `confirm_delete_url`.\nUse the `next_analysis_url` URL if you want to avoid accidentally deleting the final analysis\nin a set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the `confirm_delete_url` URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set, the value of `next_analysis_url` and `confirm_delete_url`\nin the 200 response is `null`.\n\nAs an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find an analysis that's identified as deletable.\nEach set of analyses always has one that's identified as deletable.\nHaving found the deletable analysis for one of the two sets,\ndelete this analysis and then continue deleting the next analysis in the set until they're all deleted.\nThen repeat the process for the second set.\nThe procedure therefore consists of a nested loop:\n\n**Outer loop**:\n* List the analyses for the repository, filtered by tool.\n* Parse this list to find a deletable analysis. If found:\n\n **Inner loop**:\n * Delete the identified analysis.\n * Parse the response for the value of `confirm_delete_url` and, if found, use this in the next iteration.\n\nThe above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the `confirm_delete_url` value. Alternatively, you could use the `next_analysis_url` value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.",
|
||||
"operationId": "code-scanning/delete-analysis",
|
||||
"tags": [
|
||||
"code-scanning"
|
||||
@@ -48677,7 +48677,7 @@
|
||||
"categoryLabel": "Code scanning",
|
||||
"notes": [],
|
||||
"bodyParameters": [],
|
||||
"descriptionHTML": "<p>Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the <code>repo</code> scope. For public repositories,\nyou must use an access token with <code>public_repo</code> and <code>repo:security_events</code> scopes.\nGitHub Apps must have the <code>security_events</code> write permission to use this endpoint.</p>\n<p>You can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.</p>\n<p>When you list the analyses for a repository,\none or more will be identified as deletable in the response:</p>\n<pre><code>\"deletable\": true\n</code></pre>\n<p>An analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:</p>\n<ul>\n<li><code>ref</code></li>\n<li><code>tool</code></li>\n<li><code>analysis_key</code></li>\n<li><code>environment</code></li>\n</ul>\n<p>If you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:</p>\n<pre><code>Analysis specified is not deletable.\n</code></pre>\n<p>The response from a successful <code>DELETE</code> operation provides you with\ntwo alternative URLs for deleting the next analysis in the set\n(see the example default response below).\nUse the <code>next_analysis_url</code> URL if you want to avoid accidentally deleting the final analysis\nin the set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the <code>confirm_delete_url</code> URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set the value of <code>next_analysis_url</code> and <code>confirm_delete_url</code>\nin the 200 response is <code>null</code>.</p>\n<p>As an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find the deletable analysis for one of the sets,\nstep through deleting the analyses in that set,\nand then repeat the process for the second set.\nThe procedure therefore consists of a nested loop:</p>\n<p><strong>Outer loop</strong>:</p>\n<ul>\n<li>\n<p>List the analyses for the repository, filtered by tool.</p>\n</li>\n<li>\n<p>Parse this list to find a deletable analysis. If found:</p>\n<p><strong>Inner loop</strong>:</p>\n<ul>\n<li>Delete the identified analysis.</li>\n<li>Parse the response for the value of <code>confirm_delete_url</code> and, if found, use this in the next iteration.</li>\n</ul>\n</li>\n</ul>\n<p>The above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the <code>confirm_delete_url</code> value. 
Alternatively, you could use the <code>next_analysis_url</code> value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.</p>",
|
||||
"descriptionHTML": "<p>Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the <code>repo</code> scope. For public repositories,\nyou must use an access token with <code>public_repo</code> and <code>repo:security_events</code> scopes.\nGitHub Apps must have the <code>security_events</code> write permission to use this endpoint.</p>\n<p>You can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.</p>\n<p>When you list the analyses for a repository,\none or more will be identified as deletable in the response:</p>\n<pre><code>\"deletable\": true\n</code></pre>\n<p>An analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:</p>\n<ul>\n<li><code>ref</code></li>\n<li><code>tool</code></li>\n<li><code>analysis_key</code></li>\n<li><code>environment</code></li>\n</ul>\n<p>If you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:</p>\n<pre><code>Analysis specified is not deletable.\n</code></pre>\n<p>The response from a successful <code>DELETE</code> operation provides you with\ntwo alternative URLs for deleting the next analysis in the set:\n<code>next_analysis_url</code> and <code>confirm_delete_url</code>.\nUse the <code>next_analysis_url</code> URL if you want to avoid accidentally deleting the final analysis\nin a set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the <code>confirm_delete_url</code> URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set, the value of <code>next_analysis_url</code> and <code>confirm_delete_url</code>\nin the 200 response is <code>null</code>.</p>\n<p>As an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find an analysis that's identified as deletable.\nEach set of analyses always has one that's identified as deletable.\nHaving found the deletable analysis for one of the two sets,\ndelete this analysis and then continue deleting the next analysis in the set until they're all deleted.\nThen repeat the process for the second set.\nThe procedure therefore consists of a nested loop:</p>\n<p><strong>Outer loop</strong>:</p>\n<ul>\n<li>\n<p>List the analyses for the repository, filtered by tool.</p>\n</li>\n<li>\n<p>Parse this list to find a deletable analysis. 
If found:</p>\n<p><strong>Inner loop</strong>:</p>\n<ul>\n<li>Delete the identified analysis.</li>\n<li>Parse the response for the value of <code>confirm_delete_url</code> and, if found, use this in the next iteration.</li>\n</ul>\n</li>\n</ul>\n<p>The above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the <code>confirm_delete_url</code> value. Alternatively, you could use the <code>next_analysis_url</code> value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.</p>",
|
||||
"responses": [
|
||||
{
|
||||
"httpStatusCode": "200",
|
||||
@@ -79686,13 +79686,13 @@
|
||||
"x-codeSamples": [
|
||||
{
|
||||
"lang": "Shell",
|
||||
"source": "curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://api.github.com/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"value\"}]}'",
|
||||
"html": "<pre><code class=\"hljs language-shell\">curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://api.github.com/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"value\"}]}'</code></pre>"
|
||||
"source": "curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://api.github.com/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"any\"}]}'",
|
||||
"html": "<pre><code class=\"hljs language-shell\">curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://api.github.com/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"any\"}]}'</code></pre>"
|
||||
},
|
||||
{
|
||||
"lang": "JavaScript",
|
||||
"source": "await octokit.request('PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}', {\n enterprise: 'enterprise',\n scim_group_id: 'scim_group_id',\n schemas: [\n 'schemas'\n ],\n Operations: [\n {\n op: 'op',\n path: 'path',\n value: 'value'\n }\n ]\n})",
|
||||
"html": "<pre><code class=\"hljs language-javascript\"><span class=\"hljs-keyword\">await</span> octokit.<span class=\"hljs-title hljs-function\">request</span>(<span class=\"hljs-string\">'PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}'</span>, {\n <span class=\"hljs-attr\">enterprise</span>: <span class=\"hljs-string\">'enterprise'</span>,\n <span class=\"hljs-attr\">scim_group_id</span>: <span class=\"hljs-string\">'scim_group_id'</span>,\n <span class=\"hljs-attr\">schemas</span>: [\n <span class=\"hljs-string\">'schemas'</span>\n ],\n <span class=\"hljs-title hljs-class\">Operations</span>: [\n {\n <span class=\"hljs-attr\">op</span>: <span class=\"hljs-string\">'op'</span>,\n <span class=\"hljs-attr\">path</span>: <span class=\"hljs-string\">'path'</span>,\n <span class=\"hljs-attr\">value</span>: <span class=\"hljs-string\">'value'</span>\n }\n ]\n})\n</code></pre>"
|
||||
"source": "await octokit.request('PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}', {\n enterprise: 'enterprise',\n scim_group_id: 'scim_group_id',\n schemas: [\n 'schemas'\n ],\n Operations: [\n {\n op: 'op',\n path: 'path',\n value: 'any'\n }\n ]\n})",
|
||||
"html": "<pre><code class=\"hljs language-javascript\"><span class=\"hljs-keyword\">await</span> octokit.<span class=\"hljs-title hljs-function\">request</span>(<span class=\"hljs-string\">'PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}'</span>, {\n <span class=\"hljs-attr\">enterprise</span>: <span class=\"hljs-string\">'enterprise'</span>,\n <span class=\"hljs-attr\">scim_group_id</span>: <span class=\"hljs-string\">'scim_group_id'</span>,\n <span class=\"hljs-attr\">schemas</span>: [\n <span class=\"hljs-string\">'schemas'</span>\n ],\n <span class=\"hljs-title hljs-class\">Operations</span>: [\n {\n <span class=\"hljs-attr\">op</span>: <span class=\"hljs-string\">'op'</span>,\n <span class=\"hljs-attr\">path</span>: <span class=\"hljs-string\">'path'</span>,\n <span class=\"hljs-attr\">value</span>: <span class=\"hljs-string\">'any'</span>\n }\n ]\n})\n</code></pre>"
|
||||
}
|
||||
],
|
||||
"summary": "Update an attribute for a SCIM enterprise group",
|
||||
@@ -79756,21 +79756,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
},
|
||||
@@ -79813,21 +79803,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
{
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
]
|
||||
@@ -79926,21 +79906,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
},
|
||||
@@ -79983,21 +79953,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
{
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
]
|
||||
@@ -88678,6 +88638,15 @@
|
||||
"default": 1
|
||||
},
|
||||
"descriptionHTML": "<p>Page number of the results to fetch.</p>"
|
||||
},
|
||||
{
|
||||
"name": "repository_id",
|
||||
"description": "ID of the Repository to filter on",
|
||||
"in": "query",
|
||||
"schema": {
|
||||
"type": "integer"
|
||||
},
|
||||
"descriptionHTML": "<p>ID of the Repository to filter on</p>"
|
||||
}
|
||||
],
|
||||
"x-codeSamples": [
|
||||
|
||||
@@ -67003,13 +67003,13 @@
|
||||
"x-codeSamples": [
|
||||
{
|
||||
"lang": "Shell",
|
||||
"source": "curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://{hostname}/api/v3/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"value\"}]}'",
|
||||
"html": "<pre><code class=\"hljs language-shell\">curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://{hostname}/api/v3/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"value\"}]}'</code></pre>"
|
||||
"source": "curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://{hostname}/api/v3/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"any\"}]}'",
|
||||
"html": "<pre><code class=\"hljs language-shell\">curl \\\n -X PATCH \\\n -H \"Accept: application/vnd.github.v3+json\" \\\n https://{hostname}/api/v3/scim/v2/enterprises/ENTERPRISE/Groups/SCIM_GROUP_ID \\\n -d '{\"schemas\":[\"schemas\"],\"Operations\":[{\"op\":\"op\",\"path\":\"path\",\"value\":\"any\"}]}'</code></pre>"
|
||||
},
|
||||
{
|
||||
"lang": "JavaScript",
|
||||
"source": "await octokit.request('PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}', {\n enterprise: 'enterprise',\n scim_group_id: 'scim_group_id',\n schemas: [\n 'schemas'\n ],\n Operations: [\n {\n op: 'op',\n path: 'path',\n value: 'value'\n }\n ]\n})",
|
||||
"html": "<pre><code class=\"hljs language-javascript\"><span class=\"hljs-keyword\">await</span> octokit.<span class=\"hljs-title hljs-function\">request</span>(<span class=\"hljs-string\">'PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}'</span>, {\n <span class=\"hljs-attr\">enterprise</span>: <span class=\"hljs-string\">'enterprise'</span>,\n <span class=\"hljs-attr\">scim_group_id</span>: <span class=\"hljs-string\">'scim_group_id'</span>,\n <span class=\"hljs-attr\">schemas</span>: [\n <span class=\"hljs-string\">'schemas'</span>\n ],\n <span class=\"hljs-title hljs-class\">Operations</span>: [\n {\n <span class=\"hljs-attr\">op</span>: <span class=\"hljs-string\">'op'</span>,\n <span class=\"hljs-attr\">path</span>: <span class=\"hljs-string\">'path'</span>,\n <span class=\"hljs-attr\">value</span>: <span class=\"hljs-string\">'value'</span>\n }\n ]\n})\n</code></pre>"
|
||||
"source": "await octokit.request('PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}', {\n enterprise: 'enterprise',\n scim_group_id: 'scim_group_id',\n schemas: [\n 'schemas'\n ],\n Operations: [\n {\n op: 'op',\n path: 'path',\n value: 'any'\n }\n ]\n})",
|
||||
"html": "<pre><code class=\"hljs language-javascript\"><span class=\"hljs-keyword\">await</span> octokit.<span class=\"hljs-title hljs-function\">request</span>(<span class=\"hljs-string\">'PATCH /scim/v2/enterprises/{enterprise}/Groups/{scim_group_id}'</span>, {\n <span class=\"hljs-attr\">enterprise</span>: <span class=\"hljs-string\">'enterprise'</span>,\n <span class=\"hljs-attr\">scim_group_id</span>: <span class=\"hljs-string\">'scim_group_id'</span>,\n <span class=\"hljs-attr\">schemas</span>: [\n <span class=\"hljs-string\">'schemas'</span>\n ],\n <span class=\"hljs-title hljs-class\">Operations</span>: [\n {\n <span class=\"hljs-attr\">op</span>: <span class=\"hljs-string\">'op'</span>,\n <span class=\"hljs-attr\">path</span>: <span class=\"hljs-string\">'path'</span>,\n <span class=\"hljs-attr\">value</span>: <span class=\"hljs-string\">'any'</span>\n }\n ]\n})\n</code></pre>"
|
||||
}
|
||||
],
|
||||
"summary": "Update an attribute for a SCIM enterprise group",
|
||||
@@ -67073,21 +67073,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
},
|
||||
@@ -67130,21 +67120,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
{
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
]
|
||||
@@ -67243,21 +67223,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
},
|
||||
@@ -67300,21 +67270,11 @@
|
||||
"childParamsGroups": []
|
||||
},
|
||||
{
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
],
|
||||
"description": "<p>Can be any value - string, number, array or object.</p>",
|
||||
"name": "value",
|
||||
"in": "body",
|
||||
"type": "string or object or array",
|
||||
"description": "",
|
||||
"rawDescription": "Can be any value - string, number, array or object.",
|
||||
"type": "",
|
||||
"childParamsGroups": []
|
||||
}
|
||||
]
|
||||
|
||||
@@ -21932,10 +21932,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"config": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"config_was": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"content_type": {
|
||||
"type": "string"
|
||||
@@ -21955,10 +21961,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"events": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"events_were": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"explanation": {
|
||||
"type": "string"
|
||||
@@ -62121,10 +62133,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"config": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"config_was": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"content_type": {
|
||||
"type": "string"
|
||||
@@ -62144,10 +62162,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"events": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"events_were": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"explanation": {
|
||||
"type": "string"
|
||||
@@ -87721,7 +87745,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -87733,7 +87760,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -88008,7 +88038,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -88020,7 +88053,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -164020,6 +164056,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -164981,6 +165018,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -166196,6 +166234,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -177121,6 +177160,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -180428,7 +180468,7 @@
|
||||
},
|
||||
"delete": {
|
||||
"summary": "Delete a code scanning analysis from a repository",
|
||||
"description": "Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the `repo` scope. For public repositories,\nyou must use an access token with `public_repo` and `repo:security_events` scopes.\nGitHub Apps must have the `security_events` write permission to use this endpoint.\n\nYou can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.\n\nWhen you list the analyses for a repository,\none or more will be identified as deletable in the response:\n\n```\n\"deletable\": true\n```\n\nAn analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:\n\n* `ref`\n* `tool`\n* `analysis_key`\n* `environment`\n\nIf you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:\n\n```\nAnalysis specified is not deletable.\n```\n\nThe response from a successful `DELETE` operation provides you with\ntwo alternative URLs for deleting the next analysis in the set\n(see the example default response below).\nUse the `next_analysis_url` URL if you want to avoid accidentally deleting the final analysis\nin the set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the `confirm_delete_url` URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set the value of `next_analysis_url` and `confirm_delete_url`\nin the 200 response is `null`.\n\nAs an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find the deletable analysis for one of the sets,\nstep through deleting the analyses in that set,\nand then repeat the process for the second set.\nThe procedure therefore consists of a nested loop:\n\n**Outer loop**:\n* List the analyses for the repository, filtered by tool.\n* Parse this list to find a deletable analysis. If found:\n\n **Inner loop**:\n * Delete the identified analysis.\n * Parse the response for the value of `confirm_delete_url` and, if found, use this in the next iteration.\n\nThe above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the `confirm_delete_url` value. Alternatively, you could use the `next_analysis_url` value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.",
|
||||
"description": "Deletes a specified code scanning analysis from a repository. For\nprivate repositories, you must use an access token with the `repo` scope. For public repositories,\nyou must use an access token with `public_repo` and `repo:security_events` scopes.\nGitHub Apps must have the `security_events` write permission to use this endpoint.\n\nYou can delete one analysis at a time.\nTo delete a series of analyses, start with the most recent analysis and work backwards.\nConceptually, the process is similar to the undo function in a text editor.\n\nWhen you list the analyses for a repository,\none or more will be identified as deletable in the response:\n\n```\n\"deletable\": true\n```\n\nAn analysis is deletable when it's the most recent in a set of analyses.\nTypically, a repository will have multiple sets of analyses\nfor each enabled code scanning tool,\nwhere a set is determined by a unique combination of analysis values:\n\n* `ref`\n* `tool`\n* `analysis_key`\n* `environment`\n\nIf you attempt to delete an analysis that is not the most recent in a set,\nyou'll get a 400 response with the message:\n\n```\nAnalysis specified is not deletable.\n```\n\nThe response from a successful `DELETE` operation provides you with\ntwo alternative URLs for deleting the next analysis in the set:\n`next_analysis_url` and `confirm_delete_url`.\nUse the `next_analysis_url` URL if you want to avoid accidentally deleting the final analysis\nin a set. This is a useful option if you want to preserve at least one analysis\nfor the specified tool in your repository.\nUse the `confirm_delete_url` URL if you are content to remove all analyses for a tool.\nWhen you delete the last analysis in a set, the value of `next_analysis_url` and `confirm_delete_url`\nin the 200 response is `null`.\n\nAs an example of the deletion process,\nlet's imagine that you added a workflow that configured a particular code scanning tool\nto analyze the code in a repository. This tool has added 15 analyses:\n10 on the default branch, and another 5 on a topic branch.\nYou therefore have two separate sets of analyses for this tool.\nYou've now decided that you want to remove all of the analyses for the tool.\nTo do this you must make 15 separate deletion requests.\nTo start, you must find an analysis that's identified as deletable.\nEach set of analyses always has one that's identified as deletable.\nHaving found the deletable analysis for one of the two sets,\ndelete this analysis and then continue deleting the next analysis in the set until they're all deleted.\nThen repeat the process for the second set.\nThe procedure therefore consists of a nested loop:\n\n**Outer loop**:\n* List the analyses for the repository, filtered by tool.\n* Parse this list to find a deletable analysis. If found:\n\n **Inner loop**:\n * Delete the identified analysis.\n * Parse the response for the value of `confirm_delete_url` and, if found, use this in the next iteration.\n\nThe above process assumes that you want to remove all trace of the tool's analyses from the GitHub user interface, for the specified repository, and it therefore uses the `confirm_delete_url` value. Alternatively, you could use the `next_analysis_url` value, which would leave the last analysis in each set undeleted to avoid removing a tool's analysis entirely.",
|
||||
"operationId": "code-scanning/delete-analysis",
|
||||
"tags": [
|
||||
"code-scanning"
|
||||
@@ -183029,7 +183069,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -185303,7 +185344,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -198643,6 +198685,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -294324,6 +294367,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -294602,6 +294648,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -316721,7 +316770,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -318995,7 +319045,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -352714,17 +352765,7 @@
|
||||
"type": "string"
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
]
|
||||
"description": "Can be any value - string, number, array or object."
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -381215,6 +381256,14 @@
|
||||
"type": "integer",
|
||||
"default": 1
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "repository_id",
|
||||
"description": "ID of the Repository to filter on",
|
||||
"in": "query",
|
||||
"schema": {
|
||||
"type": "integer"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
@@ -383203,7 +383252,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -385981,7 +386031,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -388255,7 +388306,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -393906,7 +393958,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -396337,7 +396390,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -399102,7 +399156,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -401628,7 +401683,8 @@
|
||||
"storage_in_bytes",
|
||||
"memory_in_bytes",
|
||||
"cpus"
|
||||
]
|
||||
],
|
||||
"nullable": true
|
||||
},
|
||||
"created_at": {
|
||||
"type": "string",
|
||||
@@ -425098,7 +425154,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -425110,7 +425169,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -425387,7 +425449,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -425399,7 +425464,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -454265,7 +454333,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -454277,7 +454348,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -454569,7 +454643,10 @@
|
||||
"title": "Container Metadata",
|
||||
"properties": {
|
||||
"tags": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
@@ -454581,7 +454658,10 @@
|
||||
"title": "Docker Metadata",
|
||||
"properties": {
|
||||
"tag": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
|
||||
@@ -140959,6 +140959,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -141920,6 +141921,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -143135,6 +143137,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -153856,6 +153859,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -169344,6 +169348,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -259710,6 +259715,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -259988,6 +259996,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
|
||||
@@ -66259,10 +66259,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"config": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"config_was": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"content_type": {
|
||||
"type": "string"
|
||||
@@ -66282,10 +66288,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"events": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"events_were": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"explanation": {
|
||||
"type": "string"
|
||||
@@ -142875,6 +142887,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -143836,6 +143849,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -145051,6 +145065,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -155772,6 +155787,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -172337,6 +172353,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -262703,6 +262720,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -262981,6 +263001,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
|
||||
@@ -67299,10 +67299,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"config": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"config_was": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"content_type": {
|
||||
"type": "string"
|
||||
@@ -67322,10 +67328,16 @@
|
||||
"type": "string"
|
||||
},
|
||||
"events": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"events_were": {
|
||||
"type": "array"
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "object"
|
||||
}
|
||||
},
|
||||
"explanation": {
|
||||
"type": "string"
|
||||
@@ -146492,6 +146504,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -147453,6 +147466,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -148668,6 +148682,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -159437,6 +159452,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -176053,6 +176069,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -269452,6 +269469,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -269730,6 +269750,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
|
||||
@@ -115991,6 +115991,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -116952,6 +116953,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -118167,6 +118169,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -129076,6 +129079,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -145397,6 +145401,7 @@
|
||||
"nullable": true
|
||||
},
|
||||
"pull_requests": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"title": "Pull Request Minimal",
|
||||
"type": "object",
|
||||
@@ -236774,6 +236779,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -237052,6 +237060,9 @@
|
||||
},
|
||||
"domains": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"description": "Array of the domain set and its alternate name (if it is configured)",
|
||||
"example": [
|
||||
"example.com",
|
||||
@@ -286695,17 +286706,7 @@
|
||||
"type": "string"
|
||||
},
|
||||
"value": {
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "string"
|
||||
},
|
||||
{
|
||||
"type": "object"
|
||||
},
|
||||
{
|
||||
"type": "array"
|
||||
}
|
||||
]
|
||||
"description": "Can be any value - string, number, array or object."
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
|
||||
@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1dec4ecf51b394d2b2c71b68f77df8b7c1765547eb32e4c8800b507b6ac96f00
size 567844
oid sha256:f725a37e6a3209777785d6af0e1b1e043d4cfec1cc5517a0d20350208d34a5ee
size 567740

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ba466a98f0a8eef883ed7a20fffd9538bb5ae7a9a3e942714ed38a62ed10a5b2
size 985846
oid sha256:bef7fdd10d3e8a38d1fa18666e9c0fc4ba717847866ae6f154902e3fda9620df
size 985619

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ca43c625ea2885492f43dba5d1510d5ec9cda71db6fefebb5cbebbe212d7ec4d
size 471241
oid sha256:f4bd66a4d97e440ad3999bf5b3aa91dfe43433de076352de221a59ba497bcf80
size 470440

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eed20830431e530167354206d667bd858426b156d3264e619513fa6c3adbfc16
size 1844206
oid sha256:292fa6f9a632411471dc7ed049438e4152071b3af4945b3285eb4ac107cfbe9a
size 1843681

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:56e9bbaa9e7c2b883a1af2cc8ba0ecd22ce71dc5b1ab2b9d51710e35792eadec
size 500156
oid sha256:b25b94980121731ffa147774476b12f21d1e6afb098ac4c673ec4aa4d913309c
size 500516

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:87cf9c81585da6cff4da4eab20cfedfcfbcd3e6805011889550a08802b632f01
size 2100844
oid sha256:28432b75535ff4355f0d70e1fe420fb742a997f3aff53c4476c09ebebd07f075
size 2101435

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:26204258a431bde232ca8d0175a5f210d3c8c50f8682087ebd6c6cc7e1eafe4e
size 592162
oid sha256:134c48e2e2a74fa3cd963f8f978e3411340aafa4d3f482969f93a69d01e25c47
size 592151

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eb0d345a3676e871ea8a6aaf9a2750129fa24e8f71b25210f04dff7bd8500391
size 3228894
oid sha256:7d59da6976a621ea38bc4f684a3cf16bc942813404e0d58d26999eba3dee9716
size 3228030

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7bbde629590ca386648ff1e5fa608f176aed878546320b6503ca1a21a53b0c77
size 498081
oid sha256:ad1ea5e74b1bbcd6b10fc0bd28b91fe60be04ea037b0c65e306cf972dfeda511
size 498053

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0f36fa9e16a79d062e74d75694e59bea4383f04d5408177597be44bfc40a23e6
size 2125039
oid sha256:ed00a4337fb427ce8f139600251bd8dfe87e86bb9251c4402050bab95751bf7c
size 2125697

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0537d73713110ee57fdc15b36acad1d716bee491cbfd43e643301f1ed258809d
size 581730
oid sha256:eb4c96243d127507167c86f524a0c00ca79faf7b3e728bfdc46cd3407ec8c80e
size 581936

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ce6a3af8fcb2a2a70abd38a787f61173becb3f08092acc5fd0adab60d42a85a4
size 1015201
oid sha256:5a0300121a476d6eb456f2d70b99ee75350d5d5b11bdacbcb32848522d717a41
size 1014940

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8ce9fa7a1e9d524d12d3c424b2c169e71fad23efc6a4c018c88477521d29fef0
size 480370
oid sha256:1c5ff13f6d7d8155eb39de21ca9d5622578adb6918ad548fd32ea93f56ddf16e
size 480550

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:faf31369460931b1c163bd6c2d337799de016d1a3292d92bd18155f2743d7159
size 1888058
oid sha256:8347fb651483d4134ee20daade08a8edd8b3d6c94ee4bc42d6bddf2968dbb570
size 1887665

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:373d576e759232b22bd6c8a13fd0149503ba14918f6f29bf0993aad7a45b2a2c
size 511245
oid sha256:4a15305407a8619beafabccb5d211ef61fb3d1250d02294c9cc3f5d0b284992f
size 511189

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:4c756f60d6041c7914a253f52e8c44d52e138732f0b0d758d536603462bcdb1e
size 2148498
oid sha256:1150eb57775d9367dd30aa63d2a397ea7a9f2e9ab26659dfa4901ff5a87e3eae
size 2148283

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:df7034604ea1c3fa99a4209d9490b9495788a98917887b6c3844306d3123e438
size 605507
oid sha256:6fbff2d50295558f81d139a3fb9b08386c2a9cec389cc25511fafb9718d4fc6e
size 605743

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2f755261bd1c7db51e72358e0ab47e47fdb559a99cb9aacfb1c9d26f4927bbbe
size 3309201
oid sha256:912ecd0539f3b535363b1b4bfb670681307ef2801df86648877a9c4af8b9d350
size 3310362

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:68894543372c63dad9932b5ecadd2f92a405fa9b555bed3b0853a361b5059718
size 508898
oid sha256:1a9c52f19be8088b7969958e088ad7eee82ed880d061d2b95f50d294296d0439
size 508941

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2f18797c2e312ccdca09ba668a4667e7a41cf6903ad4c3f1efc14dc6d978151b
size 2173957
oid sha256:4fddc53f7a8f77253d7cb97214aecf653421d99f9e3ba4179137d3b193aba093
size 2174254

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1915d8a859c4090b1eb6e3120e79399ac3ca86437777907b000ece957d036e1d
size 594121
oid sha256:faa7e676fdefc369e3affba1c7b85532b8a4adaf45b25384b5ed2f4f58711325
size 594129

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a3594d2793b68c52e03eb1417f0491bca13e13be5a0fc6f03ac2fe4fb78a3b13
size 1036722
oid sha256:7c8dc3a7003024443f26178f92a2d93b368359c52b83e7d3f44c90630e57727e
size 1036733

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:209e790585cb541d5b7d0ad16e722d646ad623fbaf8677b974ca8b213aeb40aa
size 489136
oid sha256:af7799c014d6fd6679677fdee7dd95f5c65ced1d67ca1feba6bdfff6cadc9200
size 489301

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2c14f3814017536840e85345b9416da9eb96d6e5bd67d202c4cbeb3b95f4eb4c
size 1921181
oid sha256:bec0e2979bb636a42eb52c08fcbb67496ecd4a574cb2fd5dc0d38fc5e3a60de0
size 1921067

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:027d2d1d0fbbf72dfdc3bc6959606b57e25ee89a720785273eb42e50bbc98c8c
size 520352
oid sha256:3253b7c60d4e8db1dcae43fa85451fb5a016d008869a7a4d7dadb1a7805ee4bf
size 520323

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:b33c1ab97e53736a7432a9ee6d5c32b62c1b0a22fb9082ece44b3831bb3bae3a
size 2186817
oid sha256:2abf9133f5fa54557378f0c81092773ddf2bfb4c199ed2d5ac2cabf36af20873
size 2186384

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6e18e6e638f24f382e1599e77d36112fb6bd0b977ab3b23dca9f59ad5692c1f7
size 616957
oid sha256:20a3f2f24a7b5e873236022e2f090df17e58ca2f9038598a79fc8aed69935bb7
size 617109

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:36ffc1a7257816a61c42ba779a7f89dc7a9371c34207a368190a2da6e3a7840a
size 3377867
oid sha256:14c20325cb941f3f89862bbd6a021294641c3348f16a6d55ec7f210d170d8a87
size 3377360

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:248dcf82b4bb971ff6ff52e513dca3a97f51b54462b41996c1fdd4d5b4422023
size 518841
oid sha256:0b4ff62537748b8db053928995116b486d129bc1b7af1d02d8144c9ecf609096
size 518547

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:cab39c161d0151ad8dcf5a0884407acc0b0bd2472d6f2b76d57bc3f3801c10c1
size 2213931
oid sha256:8fe215415e353e2c0b9f72ebbb926b5924914d635a9e5f1cc3dd21f0abe4dc26
size 2213677

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5f91fece8950ff955d78a74de38c2b3127dabff2577e1fe429006b27658bfdde
size 819436
oid sha256:97e99275c74260b2d3f2fc9a84f79b71a607730b141caeca8f06fe75cd2b1465
size 819127

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:badb49304e7a2f5269435cc611e2dcdf7a2f6280050c62abe6bc435a25009fc9
size 1329501
oid sha256:2270fe66bab25d410e55314001e9153a7fc5c3b66a2d518b8b6c7bf436e24633
size 1329231

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:16035d939d1b1beb8eb4e8fffbea9b19e3230f7cf6c6dbbf32c210b2d63b8b68
size 654270
oid sha256:4bcc1a487680a49c7e67f979d017161565dca3b93cb4a81c6d10ea6edcdfae1c
size 654832

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:49d4e6edee64c4e42c9f919ec72f2beb2cabdf75163471a83622bc484c9240ed
size 2450653
oid sha256:b2f516b1f7b8f998aec2c1ae55c1096a5fe4d273e70e73c98486a0c13efe69cd
size 2452447

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:86e99ef9eb469ff4e3dedde9326f0acdf0f5747d90ffb95f45e50dcda5895ab3
size 711500
oid sha256:f5799abb417e1cb25e9c70c01365c801f88533fc695c01ad91dfff8a69eccee1
size 711498

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:89a12be7020a9c63a37f250c7c95695b160f1772b70ee04d241da5208fa1d873
size 2906717
oid sha256:d6b9f4e84946e56b24f025a2a100633367b859c8edff4b2a2cc72416e5819b2d
size 2905615

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f91c9abab648811bfe00fe862d1d97fd58bcb774aff5a69a5cc8d33f19363342
size 842345
oid sha256:ae18a78c0cc8fd1ef07d428ce542706f6c6ee2e6c12e77ed304ab7612eb1d663
size 841983

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8408e51813dcc1f10edfee74bbac35eb3627ae781ecbddd42f708a2b4f910ad7
size 4496332
oid sha256:13dc92d387639cd9fea8f6cfdc850c870c57d64aac153e502aea6fd2acddaa19
size 4494321

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:e8d18a06f5d4c6d81baaf06b9edbe4b3a080f742eb00a462819e8ab688571169
size 711878
oid sha256:68e2c18df66d1bee91d33abc11a9c032249f77a8d715a34e1077bfeb540577aa
size 712021

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:99f56b8e833e838bca4f5c7181f4c315be43b893e35b8e72fdf925c0e2359657
size 2949329
oid sha256:aba2ced9287783965e3e69cff8dca82bfebc51ad2133df046735663a27605e97
size 2950351

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:51215693a97a50665d8ff6dcbeae1f0d6e2e6fbcf9458ac98936af21dccfdf08
size 456135
oid sha256:457e455be87083fe972fc7451843f0a05baceb6635f8dceea2fdac0fae2083bd
size 455846

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:875f640231b842136f366665801222bbf712c47bfda6a013e0436bf88aba954a
size 766285
oid sha256:36ff986c554d0023a95297ecafca9f17a1d7c88bd36bf1bb364448b71fb0c5a6
size 766401

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:57a201a66fc0322696070b3cdf95ea86a5a1ec2eed3085f62cba07077bbc2df4
size 378103
oid sha256:859580d47dedeca12f432c89a0b5cbd930dbfa9f43d82ab3b0edfd6aaa60706f
size 378578

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8630bd81905663af79a464c611241533d81e46e1fca000d167531ee1bd6f919d
size 1417835
oid sha256:b8a922cbed534ab42e37cfd9752130880f855cf9fc2037553fd16b99f0bd7c2d
size 1417874

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c736f39a78f5620c9f8d96d014e5974b34a1287bcfdbf5d5f0a03773c42b8f8a
size 402831
oid sha256:a63b875f9b798196eaa296f8391514df92e8b6416bdf639c45922df0750eedd4
size 402607

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:0f3319a80c2c9c639e391062ff788160cbe3a7587717feaa53ff604eedc9236f
size 1635880
oid sha256:ab83f4e9d0c5eaa61812e2ccee30cbecc7ea24bde6135d6cde98a2dffb1ba541
size 1635710

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ac19aa8dd18e6ec97489c2aed7015643ad7909c0d466aa4bde240abf64f20dc5
size 475381
oid sha256:d6ab2284026f93f8c33010c9cfcefe705305912b3607b23f9613b3333ae534a1
size 475515

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:fc4989b5b6ff627677993e6ef04ed22576d94505610ea6df4f9ae03fbc6d15e2
size 2500233
oid sha256:004b882f6b02d5f4b5cedeac69d8ab10ccb02b3289c3464731204b715c7a0fd9
size 2501204

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f034a962b3c44e6fd1c43cba55be6ea740356cf964b30ef89c34777ead5178c0
size 401477
oid sha256:5992fb4da846444fc66bb13bc79d7cb7c96568b891e43e65bb4333d64ab04af6
size 401473

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bb2b26f0f12fc6eeec941f98337ea27d8e6c3ca10e8e1f6f1f9b3e3f9cf461d7
size 1655510
oid sha256:050e0455b230a5c44bfa8b9755332a30316c961a0b375d11a047268a5aaaa1d0
size 1654753

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f276c0cbd7e5020edeb3d2d4555bf79fc583383c7ae822601e0f554b6d64f912
size 699888
oid sha256:3937a56a9689100ea2c845b7dfafebaaf4898b0b0d0991ce51aa8eade5edeefb
size 699988

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c1dfd0006bd4c566cb10b067618d852602249cdf5d1af4e7758da6dfad3c4851
size 1245018
oid sha256:e2e38f398a501a254c4737cd4b5a75ce190ef8bdb3cf8eb2d301bcf5d2a2272b
size 1245258

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:901413bb111fd0af3c70fb15cfec09f0eff82c5c96200e337b16b3e898855591
size 591482
oid sha256:9696f2c72fc6d92a34e4abc59a8a5d338290ca2ca0a77744d440ec5814805c0c
size 592197

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:399564db4ea7baf3e1bf18bd1057262ef167e0535fcbfc6546890781e737c844
size 2331970
oid sha256:ab8e118a821966eba31d9bdf3558778639ee1c4a2f4659a58830685d2829a861
size 2332544

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:c2d79ed33168a9d90e3af0d036699de78130065409dc826c2e558dfffc3a5e49
size 629263
oid sha256:897e5c5ae205a2852c79980d7cbdb6dc15cbfd7d6c09dea2ffb9fe77da93a62e
size 629240

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:45a0e25e6fc8bbe7f36f6f9b6b75dd0de3678dc8ecd89770f53850e9188e34ac
size 2641962
oid sha256:1463f92eec829188763ce4d7351c16154517cab02cb0ddd73beeaafeefe6ef8d
size 2642431

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:11e8192190038392c8e603f5b705faa607e956236a77415740e144874cfe181f
size 726601
oid sha256:2a136be30eb41c2fd04c5fbd32e57e99899f8e1b718a79a08b7d30fc45d4db6d
size 726876

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:63c93a94320c696a9fc7a58f648600aa79d25004baccaecd894fd811fc72ef4d
size 3991879
oid sha256:fa12cbdfe57fb50c7d83d9d6a07602866e9ff1cacc9323240689c94fadb5b02a
size 3994132

Some files were not shown because too many files have changed in this diff.