
Merge pull request #29141 from github/repo-sync

Repo sync
Authored by docs-bot on 2023-10-13 10:32:44 -07:00; committed by GitHub
64 changed files with 1294 additions and 2487 deletions


@@ -0,0 +1,44 @@
---
title: Changing the hostname for your instance
shortTitle: Change hostname
intro: "If you want to change the hostname for an existing {% data variables.product.prodname_ghe_server %} instance, you must restore the settings and data to a new instance."
versions:
ghes: '*'
type: how_to
topics:
- Enterprise
- Fundamentals
- Infrastructure
---
## About changes to the hostname for {% data variables.product.product_name %}
If you need to use a new hostname for {% data variables.location.product_location %}, you must back up the existing instance's settings and data, configure a new instance, restore the backup to the new instance, and then adjust your DNS configuration to send traffic to the new instance.
Migration to a new instance requires downtime. The amount of downtime required depends on how much data you need to back up, as well as the speed of the network connection between the backup host and the instances.
In this article, the term "source instance" refers to the instance with the old hostname, and "destination instance" refers to the instance with the new hostname.
{% data reusables.enterprise_installation.changing-hostname-not-supported %}
## Migrating to an instance with a new hostname
1. Configure a destination instance of {% data variables.product.prodname_ghe_server %} with the new hostname you'd like to use. For more information, see the following documentation.
- "[AUTOTITLE](/admin/installation/setting-up-a-github-enterprise-server-instance)"
- "[AUTOTITLE](/admin/configuration/configuring-network-settings/configuring-the-hostname-for-your-instance)"
1. Inform the instance's users of the scheduled downtime. Optionally, you can create a mandatory message that will appear for all users who sign in. For more information, see "[Customizing user messages for your enterprise](/admin/managing-accounts-and-repositories/communicating-information-to-users-in-your-enterprise/customizing-user-messages-for-your-enterprise#creating-a-mandatory-message)."
1. On the source instance, enable maintenance mode. For more information, see "[AUTOTITLE](/admin/administering-your-instance/configuring-maintenance-mode/enabling-and-scheduling-maintenance-mode#enabling-maintenance-mode-immediately-or-scheduling-a-maintenance-window-for-a-later-time)."
1. Back up the source instance's data and settings using {% data variables.product.prodname_enterprise_backup_utilities %}. For more information, see "[AUTOTITLE](/admin/backing-up-and-restoring-your-instance/configuring-backups-on-your-instance)."
1. Restore the backup to the destination instance with the desired hostname. When you run the `ghe-restore` utility, use the `-c` option to overwrite the destination instance's configuration. For more information, see "[AUTOTITLE](/admin/backing-up-and-restoring-your-instance/configuring-backups-on-your-instance)."
1. Finalize configuration of the destination instance. For more information, see "[AUTOTITLE](/admin/configuration)."
1. On the destination instance, enable maintenance mode.
1. Optionally, while the destination instance is in maintenance mode, validate the instance's configuration and verify that user data is intact. For more information, see "[AUTOTITLE](/admin/administering-your-instance/configuring-maintenance-mode/enabling-and-scheduling-maintenance-mode#validating-changes-in-maintenance-mode-using-the-ip-exception-list)."
1. To direct traffic to the destination instance, update your DNS configuration so that the source instance's hostname resolves to the IP address of the destination instance. You can verify the change with the sketch after these steps.
{% note %}
**Note**: Restored user-generated content in the instance's web application will likely contain URLs that reference the source instance's old hostname. Optionally, to ensure that these links continue to resolve to the destination instance, you can configure a redirect using DNS. In addition to the `CNAME` record that resolves to the new instance's hostname, configure a second DNS `CNAME` record that directs traffic from the original hostname to the new hostname. For more information, see the documentation for your DNS provider.
{% endnote %}
1. On the destination instance, disable maintenance mode.
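After you update DNS, you can confirm that the hostname resolves to the destination instance's IP address before you disable maintenance mode. The following is a minimal sketch using Node.js's built-in `dns` module; the hostname is a placeholder, so substitute your own.

```javascript copy
// A minimal sketch, not an official utility: check which IP addresses the
// instance's hostname currently resolves to. Replace the placeholder hostname.
const dns = require("node:dns/promises");

(async () => {
  const records = await dns.resolve4("github.companyname.com");
  console.log(records); // expect the destination instance's IP address
})();
```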


@@ -1,11 +1,13 @@
---
title: Configuring a hostname
intro: We recommend setting a hostname for your appliance instead of using a hard-coded IP address.
title: Configuring the hostname for your instance
shortTitle: Configure hostname
intro: "You can provide reliable access to {% data variables.location.product_location %} by assigning a hostname that's accessible over your network."
redirect_from:
- /enterprise/admin/guides/installation/configuring-hostnames
- /enterprise/admin/installation/configuring-a-hostname
- /enterprise/admin/configuration/configuring-a-hostname
- /admin/configuration/configuring-a-hostname
- /admin/configuration/configuring-network-settings/configuring-a-hostname
versions:
ghes: '*'
type: how_to
@@ -14,7 +16,10 @@ topics:
- Fundamentals
- Infrastructure
---
If you configure a hostname instead of a hard-coded IP address, you will be able to change the physical hardware that {% data variables.location.product_location %} runs on without affecting users or client software.
## About the hostname for {% data variables.product.product_name %}
To provide reliable access to {% data variables.location.product_location %} via a known name on the network, you can configure a hostname. If you configure a hostname instead of using a hard-coded IP address, you will be able to change the physical hardware that {% data variables.location.product_location %} runs on without affecting users or client software.
The hostname setting in the {% data variables.enterprise.management_console %} should be set to an appropriate fully qualified domain name (FQDN) that is resolvable on the internet or within your internal network. For example, your hostname setting could be `github.companyname.com`. Web and API requests will automatically redirect to the hostname configured in the {% data variables.enterprise.management_console %}. Note that `localhost` is not a valid hostname setting.
@@ -22,9 +27,11 @@ Hostnames must be less than 63 characters in length per [Section 2.3.4 of the Do
After you configure a hostname, you can enable subdomain isolation to further increase the security of {% data variables.location.product_location %}. For more information, see "[AUTOTITLE](/admin/configuration/configuring-network-settings/enabling-subdomain-isolation)."
{% data variables.product.company_short %} strongly recommends that you do not change the hostname for an existing {% data variables.product.product_name %} instance. Changing the hostname will cause unexpected behavior, up to and including instance outages. Instead, configure a new instance with the desired hostname, and then restore settings and data from the original instance to the new instance.
For more information on the supported hostname types, see [Section 2.1 of RFC 1123](https://tools.ietf.org/html/rfc1123#section-2).
{% data reusables.enterprise_installation.changing-hostname-not-supported %}
## Configuring the hostname
{% data reusables.enterprise_site_admin_settings.access-settings %}
{% data reusables.enterprise_site_admin_settings.management-console %}
@@ -35,3 +42,9 @@ For more information on the supported hostname types, see [Section 2.1 of the HT
{% data reusables.enterprise_management_console.save-settings %}
To help mitigate various cross-site scripting vulnerabilities, we recommend that you enable subdomain isolation for {% data variables.location.product_location %} after you configure a hostname. For more information, see "[AUTOTITLE](/admin/configuration/configuring-network-settings/enabling-subdomain-isolation)."
## Changing the hostname
If you need to change the hostname for {% data variables.location.product_location %}, you must restore a backup of your existing instance to a new instance with the desired hostname. For more information, see "[AUTOTITLE](/admin/configuration/configuring-network-settings/changing-the-hostname-for-your-instance)."
{% data reusables.enterprise_installation.changing-hostname-not-supported %}


@@ -15,7 +15,8 @@ topics:
children:
- /configuring-the-ip-address-using-the-virtual-machine-console
- /configuring-dns-nameservers
- /configuring-a-hostname
- /configuring-the-hostname-for-your-instance
- /changing-the-hostname-for-your-instance
- /validating-your-domain-settings
- /configuring-an-outbound-web-proxy-server
- /configuring-built-in-firewall-rules


@@ -40,7 +40,8 @@ includeGuides:
- /admin/identity-and-access-management/using-saml-for-enterprise-iam
- /admin/administering-your-instance/administering-your-instance-from-the-command-line/accessing-the-administrative-shell-ssh
- /admin/administering-your-instance/administering-your-instance-from-the-web-ui
- /admin/configuration/configuring-network-settings/configuring-a-hostname
- /admin/configuration/configuring-network-settings/configuring-the-hostname-for-your-instance
- /admin/configuration/configuring-network-settings/changing-the-hostname-for-your-instance
- /admin/backing-up-and-restoring-your-instance/configuring-backups-on-your-instance
- /admin/configuration/configuring-network-settings/configuring-built-in-firewall-rules
- /admin/code-security/managing-github-advanced-security-for-your-enterprise/configuring-code-scanning-for-your-appliance


@@ -0,0 +1,345 @@
---
title: Automatically redelivering failed deliveries for a GitHub App webhook
shortTitle: 'Automatically redeliver for {% data variables.product.prodname_github_app %}'
intro: 'You can write a script to handle failed deliveries of a {% data variables.product.prodname_github_app %} webhook.'
versions:
fpt: '*'
ghes: '*'
ghae: '*'
ghec: '*'
topics:
- Webhooks
layout: inline
redirect_from:
- /webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-a-github-app-webhook
---
## About automatically redelivering failed deliveries
This article describes how to write a script to find and redeliver failed deliveries for a {% data variables.product.prodname_github_app %} webhook. For more information about failed deliveries, see "[AUTOTITLE](/webhooks/using-webhooks/handling-failed-webhook-deliveries)."
This example shows you:
- A script that will find and redeliver failed deliveries for a {% data variables.product.prodname_github_app %} webhook
- What credentials your script will need, and how to store the credentials securely as {% data variables.product.prodname_actions %} secrets
- A {% data variables.product.prodname_actions %} workflow that can securely access your credentials and run the script periodically
This example uses {% data variables.product.prodname_actions %}, but you can also run this script on your server that handles webhook deliveries. For more information, see "[Alternative methods](#alternative-methods)."
## Storing credentials for the script
The endpoints to find and redeliver failed webhooks require a JSON web token, which is generated from the app ID and private key for your app.
The endpoints to fetch and update the value of the configuration variable require a {% data variables.product.pat_generic %}, {% data variables.product.prodname_github_app %} installation access token, or {% data variables.product.prodname_github_app %} user access token. This example uses a {% data variables.product.pat_generic %}. If your {% data variables.product.prodname_github_app %} is installed on the repository where this workflow will run and has permission to write repository variables, you can modify this example to create an installation access token during the {% data variables.product.prodname_actions %} workflow instead of using a {% data variables.product.pat_generic %}. For more information, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/making-authenticated-api-requests-with-a-github-app-in-a-github-actions-workflow)."
1. Find the app ID for your {% data variables.product.prodname_github_app %}. You can find the app ID on the settings page for your app. The app ID is different from the client ID. For more information about navigating to the settings page for your {% data variables.product.prodname_github_app %}, see "[AUTOTITLE](/apps/maintaining-github-apps/modifying-a-github-app-registration#navigating-to-your-github-app-settings)."
1. Store the app ID from the previous step as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run. For more information about storing secrets, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
1. Generate a private key for your app. For more information about generating a private key, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/managing-private-keys-for-github-apps)."
1. Store the private key, including `-----BEGIN RSA PRIVATE KEY-----` and `-----END RSA PRIVATE KEY-----`, from the previous step as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run.
{% ifversion pat-v2 %}
1. Create a {% data variables.product.pat_generic %} with the following access. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
- For a {% data variables.product.pat_v2 %}, grant the token:
- Write access to the repository variables permission
- Access to the repository where this workflow will run
- For a {% data variables.product.pat_v1 %}, grant the token the `repo` scope.
{% else %}
1. Create a {% data variables.product.pat_v1 %} with the `repo` scope. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
{% endif %}
1. Store your {% data variables.product.pat_generic %} from the previous step as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run.
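Before moving on, you can confirm that the app ID and private key work with a one-off request. The octokit `App` class used later in this article generates the JSON web token for you. The following is a minimal sketch, separate from the final script, which assumes the app ID and private key are exported as environment variables on your machine.

```javascript copy
// A minimal sketch, not part of the final script: the octokit `App` class
// generates the JSON web token from your app ID and private key, so requests
// to app-scoped endpoints are authenticated automatically.
const { App } = require("octokit");

(async () => {
  const app = new App({
    appId: process.env.APP_ID, // the numeric app ID from your app's settings page
    privateKey: process.env.PRIVATE_KEY, // PEM contents, including the BEGIN/END lines
  });

  // List one recent delivery for the app webhook to confirm the credentials work.
  const { data } = await app.octokit.request("GET /app/hook/deliveries", {
    per_page: 1,
  });
  console.log(data);
})();
```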
## Adding a workflow that will run the script
This section demonstrates how you can use a {% data variables.product.prodname_actions %} workflow to securely access the credentials that you stored in the previous section, set environment variables, and periodically run a script to find and redeliver failed deliveries.
Copy this {% data variables.product.prodname_actions %} workflow into a YAML file in the `.github/workflows` directory in the repository where you want the workflow to run. Replace the placeholders in the `Run script` step as described below.
```yaml copy annotate
#
name: Redeliver failed webhook deliveries
# This workflow runs every 6 hours or when manually triggered.
on:
schedule:
- cron: '40 */6 * * *'
workflow_dispatch:
# This workflow will use the built-in `GITHUB_TOKEN` to check out the repository contents. This grants `GITHUB_TOKEN` permission to do that.
permissions:
contents: read
#
jobs:
redeliver-failed-deliveries:
name: Redeliver failed deliveries
runs-on: ubuntu-latest
steps:
# This workflow will run a script that is stored in the repository. This step checks out the repository contents so that the workflow can access the script.
- name: Check out repo content
uses: {% data reusables.actions.action-checkout %}
# This step sets up Node.js. The script that this workflow will run uses Node.js.
- name: Setup Node.js
uses: {% data reusables.actions.action-setup-node %}
with:
node-version: '18.x'
# This step installs the octokit library. The script that this workflow will run uses the octokit library.
- name: Install dependencies
run: npm install octokit
# This step sets some environment variables, then runs a script to find and redeliver failed webhook deliveries.
# - Replace `YOUR_APP_ID_SECRET_NAME` with the name of the secret where you stored your app ID.
# - Replace `YOUR_PRIVATE_KEY_SECRET_NAME` with the name of the secret where you stored your private key.
# - Replace `YOUR_TOKEN_SECRET_NAME` with the name of the secret where you stored your {% data variables.product.pat_generic %}.
# - Replace `YOUR_LAST_REDELIVERY_VARIABLE_NAME` with the name that you want to use for a configuration variable that will be stored in the repository where this workflow is stored. The name can be any string that contains only alphanumeric characters and `_`, and does not start with `GITHUB_` or a number. For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
{% ifversion ghes or ghae %}# - Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}.{% endif %}
- name: Run script
env:
APP_ID: {% raw %}${{ secrets.YOUR_APP_ID_SECRET_NAME }}{% endraw %}
PRIVATE_KEY: {% raw %}${{ secrets.YOUR_PRIVATE_KEY_SECRET_NAME }}{% endraw %}
TOKEN: {% raw %}${{ secrets.YOUR_TOKEN_SECRET_NAME }}{% endraw %}
LAST_REDELIVERY_VARIABLE_NAME: 'YOUR_LAST_REDELIVERY_VARIABLE_NAME'
{% ifversion ghes or ghae %}HOSTNAME: 'YOUR_HOSTNAME'{% endif %}
WORKFLOW_REPO: {% raw %}${{ github.event.repository.name }}{% endraw %}
WORKFLOW_REPO_OWNER: {% raw %}${{ github.repository_owner }}{% endraw %}
run: |
node .github/workflows/scripts/redeliver-failed-deliveries.js
```
## Adding the script
This section demonstrates how you can write a script to find and redeliver failed deliveries.
Copy this script into a file called `.github/workflows/scripts/redeliver-failed-deliveries.js` in the same repository where you saved the {% data variables.product.prodname_actions %} workflow file above.
```javascript copy annotate
// This script uses {% data variables.product.company_short %}'s Octokit SDK to make API requests. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript)."
const { App, Octokit } = require("octokit");
//
async function checkAndRedeliverWebhooks() {
// Get the values of environment variables that were set by the {% data variables.product.prodname_actions %} workflow.
const APP_ID = process.env.APP_ID;
const PRIVATE_KEY = process.env.PRIVATE_KEY;
const TOKEN = process.env.TOKEN;
const LAST_REDELIVERY_VARIABLE_NAME = process.env.LAST_REDELIVERY_VARIABLE_NAME;
{% ifversion ghes or ghae %}const HOSTNAME = process.env.HOSTNAME;{% endif %}
const WORKFLOW_REPO_NAME = process.env.WORKFLOW_REPO;
const WORKFLOW_REPO_OWNER = process.env.WORKFLOW_REPO_OWNER;
// Create an instance of the octokit `App` using the {% ifversion ghes or ghae %}app ID, private key, and hostname{% else %}app ID and private key{% endif %} values that were set in the {% data variables.product.prodname_actions %} workflow.
//
// This will be used to make API requests to the webhook-related endpoints.
const app = new App({
appId: APP_ID,
privateKey: PRIVATE_KEY,{% ifversion ghes or ghae %}
Octokit: Octokit.defaults({
baseUrl: "{% data variables.product.api_url_code %}",
}),{% endif %}
});
// Create an instance of `Octokit` using the token{% ifversion ghes or ghae %} and hostname{% endif %} values that were set in the {% data variables.product.prodname_actions %} workflow.
//
// This will be used to update the configuration variable that stores the last time that this script ran.
const octokit = new Octokit({ {% ifversion ghes or ghae %}
baseUrl: "{% data variables.product.api_url_code %}",{% endif %}
auth: TOKEN,
});
try {
// Get the last time that this script ran from the configuration variable. If the variable is not defined, use the current time minus 24 hours.
const lastStoredRedeliveryTime = await getVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
const lastWebhookRedeliveryTime = lastStoredRedeliveryTime || (Date.now() - (24 * 60 * 60 * 1000)).toString();
// Record the time that this script started redelivering webhooks.
const newWebhookRedeliveryTime = Date.now().toString();
// Get the webhook deliveries that were delivered after `lastWebhookRedeliveryTime`.
const deliveries = await fetchWebhookDeliveriesSince({lastWebhookRedeliveryTime, app});
// Consolidate deliveries that have the same globally unique identifier (GUID). The GUID is constant across redeliveries of the same delivery.
let deliveriesByGuid = {};
for (const delivery of deliveries) {
deliveriesByGuid[delivery.guid]
? deliveriesByGuid[delivery.guid].push(delivery)
: (deliveriesByGuid[delivery.guid] = [delivery]);
}
// For each GUID value, if no deliveries for that GUID have been successfully delivered within the time frame, get the delivery ID of one of the deliveries with that GUID.
//
// This will prevent duplicate redeliveries if a delivery has failed multiple times.
// This will also prevent redelivery of failed deliveries that have already been successfully redelivered.
let failedDeliveryIDs = [];
for (const guid in deliveriesByGuid) {
const deliveries = deliveriesByGuid[guid];
const anySucceeded = deliveries.some(
(delivery) => delivery.status === "OK"
);
if (!anySucceeded) {
failedDeliveryIDs.push(deliveries[0].id);
}
}
// Redeliver any failed deliveries.
for (const deliveryId of failedDeliveryIDs) {
await redeliverWebhook({deliveryId, app});
}
// Update the configuration variable (or create the variable if it doesn't already exist) to store the time that this script started.
// This value will be used next time this script runs.
await updateVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
value: newWebhookRedeliveryTime,
variableExists: Boolean(lastStoredRedeliveryTime),
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
// Log the number of redeliveries.
console.log(
`Redelivered ${
failedDeliveryIDs.length
} failed webhook deliveries out of ${
deliveries.length
} total deliveries since ${new Date(Number(lastWebhookRedeliveryTime))}.`
);
} catch (error) {
// If there was an error, log the error so that it appears in the workflow run log, then throw the error so that the workflow run registers as a failure.
if (error.response) {
console.error(
`Failed to check and redeliver webhooks: ${error.response.data.message}`
);
}
console.error(error);
throw(error);
}
}
// This function will fetch all of the webhook deliveries that were delivered since `lastWebhookRedeliveryTime`.
// It uses the `octokit.paginate.iterator()` method to iterate through paginated results. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript#making-paginated-requests)."
//
// If a page of results includes deliveries that occurred before `lastWebhookRedeliveryTime`,
// it will store only the deliveries that occurred after `lastWebhookRedeliveryTime` and then stop.
// Otherwise, it will store all of the deliveries from the page and request the next page.
async function fetchWebhookDeliveriesSince({lastWebhookRedeliveryTime, app}) {
const iterator = app.octokit.paginate.iterator(
"GET /app/hook/deliveries",
{
per_page: 100,{% ifversion api-date-versioning %}
headers: {
"x-github-api-version": "{{ allVersions[currentVersion].latestApiVersion }}",
},{% endif %}
}
);
const deliveries = [];
for await (const { data } of iterator) {
const oldestDeliveryTimestamp = new Date(
data[data.length - 1].delivered_at
).getTime();
if (oldestDeliveryTimestamp < lastWebhookRedeliveryTime) {
for (const delivery of data) {
if (
new Date(delivery.delivered_at).getTime() > lastWebhookRedeliveryTime
) {
deliveries.push(delivery);
} else {
break;
}
}
break;
} else {
deliveries.push(...data);
}
}
return deliveries;
}
// This function will redeliver a failed webhook delivery.
async function redeliverWebhook({deliveryId, app}) {
await app.octokit.request("POST /app/hook/deliveries/{delivery_id}/attempts", {
delivery_id: deliveryId,
});
}
// This function gets the value of a configuration variable.
// If the variable does not exist, the endpoint returns a 404 response and this function returns `undefined`.
async function getVariable({ variableName, repoOwner, repoName, octokit }) {
try {
const {
data: { value },
} = await octokit.request(
"GET /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
}
);
return value;
} catch (error) {
if (error.status === 404) {
return undefined;
} else {
throw error;
}
}
}
// This function will update a configuration variable (or create the variable if it doesn't already exist). For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
async function updateVariable({
variableName,
value,
variableExists,
repoOwner,
repoName,
octokit,
}) {
if (variableExists) {
await octokit.request(
"PATCH /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
}
);
} else {
await octokit.request("POST /repos/{owner}/{repo}/actions/variables", {
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
});
}
}
// This will execute the `checkAndRedeliverWebhooks` function.
(async () => {
await checkAndRedeliverWebhooks();
})();
```
## Testing the script
You can manually trigger your workflow to test the script. For more information, see "[AUTOTITLE](/actions/using-workflows/manually-running-a-workflow)" and "[AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-workflow-run-logs)."
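If you prefer to trigger the run from a script instead of the web UI, you can call the REST API endpoint that creates a `workflow_dispatch` event. The following is a minimal sketch; the owner, repository, workflow file name, and branch are placeholders, and it assumes a token with sufficient permission to run workflows.

```javascript copy
// A minimal sketch: trigger the workflow run via the REST API instead of the
// web UI. Replace the placeholder owner, repository, workflow file name, and
// branch with your own values.
const { Octokit } = require("octokit");

(async () => {
  const octokit = new Octokit({ auth: process.env.TOKEN });
  await octokit.request(
    "POST /repos/{owner}/{repo}/actions/workflows/{workflow_id}/dispatches",
    {
      owner: "YOUR_REPO_OWNER",
      repo: "YOUR_REPO_NAME",
      workflow_id: "redeliver-failed-deliveries.yml",
      ref: "main",
    }
  );
})();
```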
## Alternative methods
This example used {% data variables.product.prodname_actions %} to securely store credentials and to run the script on a schedule. However, if you prefer to run this script on your server that handles webhook deliveries, you can:
- Store the credentials in another secure manner, such as a secret manager like [Azure key vault](https://azure.microsoft.com/products/key-vault). You will also need to update the script to access the credentials from their new location.
- Run the script on a schedule on your server, for example by using a cron job or task scheduler (see the sketch after this list).
- Update the script to store the last run time somewhere that your server can access and update. If you choose not to store the last run time as a {% data variables.product.prodname_actions %} configuration variable, you do not need to use a {% data variables.product.pat_generic %}, and you can remove the API calls to access and update the variable.
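For the cron option above, the following hedged sketch schedules the check every six hours with the third-party `node-cron` package. It assumes you install `node-cron` and export `checkAndRedeliverWebhooks` from the script above; neither is part of this article's workflow.

```javascript copy
// A hedged sketch, assuming the third-party `node-cron` package is installed
// and `checkAndRedeliverWebhooks` is exported from the script above: run the
// redelivery check every 6 hours on your own server.
const cron = require("node-cron");
const { checkAndRedeliverWebhooks } = require("./redeliver-failed-deliveries");

cron.schedule("40 */6 * * *", () => {
  checkAndRedeliverWebhooks().catch((error) => console.error(error));
});
```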


@@ -0,0 +1,363 @@
---
title: Automatically redelivering failed deliveries for a repository webhook
shortTitle: Automatically redeliver for repository
intro: You can write a script to handle failed deliveries of a repository webhook.
versions:
fpt: '*'
ghes: '*'
ghae: '*'
ghec: '*'
topics:
- Webhooks
layout: inline
redirect_from:
- /webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-a-repository-webhook
---
## About automatically redelivering failed deliveries
This article describes how to write a script to find and redeliver failed deliveries for a repository webhook. For more information about failed deliveries, see "[AUTOTITLE](/webhooks/using-webhooks/handling-failed-webhook-deliveries)."
This example shows you:
- A script that will find and redeliver failed deliveries for a repository webhook
- What credentials your script will need, and how to store the credentials securely as {% data variables.product.prodname_actions %} secrets
- A {% data variables.product.prodname_actions %} workflow that can securely access your credentials and run the script periodically
This example uses {% data variables.product.prodname_actions %}, but you can also run this script on your server that handles webhook deliveries. For more information, see "[Alternative methods](#alternative-methods)."
## Storing credentials for the script
The built-in `GITHUB_TOKEN` does not have sufficient permissions to redeliver webhooks. Instead of using `GITHUB_TOKEN`, this example uses a {% data variables.product.pat_generic %}. Alternatively, instead of creating a {% data variables.product.pat_generic %}, you can create a {% data variables.product.prodname_github_app %} and use the app's credentials to create an installation access token during the {% data variables.product.prodname_actions %} workflow. For more information, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/making-authenticated-api-requests-with-a-github-app-in-a-github-actions-workflow)."
{% ifversion pat-v2 %}
1. Create a {% data variables.product.pat_generic %} with the following access. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
- For a {% data variables.product.pat_v2 %}, grant the token:
- Access to the repository where your webhook was created
- Access to the repository where this workflow will run
- Write access to the repository webhooks permission
- Write access to the repository variables permission
- For a {% data variables.product.pat_v1 %}, grant the token the `repo` scope.
{% else %}
1. Create a {% data variables.product.pat_v1 %} with the `repo` scope. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
{% endif %}
1. Store your {% data variables.product.pat_generic %} as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
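After you create and store the token, you can sanity-check its access with a one-off request to the same deliveries endpoint that the script uses. The following is a minimal sketch; the owner, repository, and hook ID are placeholders.

```javascript copy
// A minimal sketch, separate from the final script: confirm that the token can
// read deliveries for your repository webhook. Replace the placeholder owner,
// repository, and hook ID with your own values.
const { Octokit } = require("octokit");

(async () => {
  const octokit = new Octokit({ auth: process.env.TOKEN });
  const { data } = await octokit.request(
    "GET /repos/{owner}/{repo}/hooks/{hook_id}/deliveries",
    {
      owner: "YOUR_REPO_OWNER",
      repo: "YOUR_REPO_NAME",
      hook_id: 123456, // placeholder webhook ID
      per_page: 1,
    }
  );
  console.log(data);
})();
```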
## Adding a workflow that will run the script
This section demonstrates how you can use a {% data variables.product.prodname_actions %} workflow to securely access the credentials that you stored in the previous section, set environment variables, and periodically run a script to find and redeliver failed deliveries.
Copy this {% data variables.product.prodname_actions %} workflow into a YAML file in the `.github/workflows` directory in the repository where you want the workflow to run. Replace the placeholders in the `Run script` step as described below.
```yaml copy annotate
#
name: Redeliver failed webhook deliveries
# This workflow runs every 6 hours or when manually triggered.
on:
schedule:
- cron: '20 */6 * * *'
workflow_dispatch:
# This workflow will use the built-in `GITHUB_TOKEN` to check out the repository contents. This grants `GITHUB_TOKEN` permission to do that.
permissions:
contents: read
#
jobs:
redeliver-failed-deliveries:
name: Redeliver failed deliveries
runs-on: ubuntu-latest
steps:
# This workflow will run a script that is stored in the repository. This step checks out the repository contents so that the workflow can access the script.
- name: Check out repo content
uses: {% data reusables.actions.action-checkout %}
# This step sets up Node.js. The script that this workflow will run uses Node.js.
- name: Setup Node.js
uses: {% data reusables.actions.action-setup-node %}
with:
node-version: '18.x'
# This step installs the octokit library. The script that this workflow will run uses the octokit library.
- name: Install dependencies
run: npm install octokit
# This step sets some environment variables, then runs a script to find and redeliver failed webhook deliveries.
# - Replace `YOUR_SECRET_NAME` with the name of the secret where you stored your {% data variables.product.pat_generic %}.
# - Replace `YOUR_REPO_OWNER` with the owner of the repository where the webhook was created.
# - Replace `YOUR_REPO_NAME` with the name of the repository where the webhook was created.
# - Replace `YOUR_HOOK_ID` with the ID of the webhook.
# - Replace `YOUR_LAST_REDELIVERY_VARIABLE_NAME` with the name that you want to use for a configuration variable that will be stored in the repository where this workflow is stored. The name can be any string that contains only alphanumeric characters and `_`, and does not start with `GITHUB_` or a number. For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
{% ifversion ghes or ghae %}# - Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}.{% endif %}
- name: Run script
env:
TOKEN: {% raw %}${{ secrets.YOUR_SECRET_NAME }}{% endraw %}
REPO_OWNER: 'YOUR_REPO_OWNER'
REPO_NAME: 'YOUR_REPO_NAME'
HOOK_ID: 'YOUR_HOOK_ID'
LAST_REDELIVERY_VARIABLE_NAME: 'YOUR_LAST_REDELIVERY_VARIABLE_NAME'
{% ifversion ghes or ghae %}HOSTNAME: 'YOUR_HOSTNAME'{% endif %}
WORKFLOW_REPO_NAME: {% raw %}${{ github.event.repository.name }}{% endraw %}
WORKFLOW_REPO_OWNER: {% raw %}${{ github.repository_owner }}{% endraw %}
run: |
node .github/workflows/scripts/redeliver-failed-deliveries.js
```
## Adding the script
This section demonstrates how you can write a script to find and redeliver failed deliveries.
Copy this script into a file called `.github/workflows/scripts/redeliver-failed-deliveries.js` in the same repository where you saved the {% data variables.product.prodname_actions %} workflow file above.
```javascript copy annotate
// This script uses {% data variables.product.company_short %}'s Octokit SDK to make API requests. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript)."
const { Octokit } = require("octokit");
//
async function checkAndRedeliverWebhooks() {
// Get the values of environment variables that were set by the {% data variables.product.prodname_actions %} workflow.
const TOKEN = process.env.TOKEN;
const REPO_OWNER = process.env.REPO_OWNER;
const REPO_NAME = process.env.REPO_NAME;
const HOOK_ID = process.env.HOOK_ID;
const LAST_REDELIVERY_VARIABLE_NAME = process.env.LAST_REDELIVERY_VARIABLE_NAME;
{% ifversion ghes or ghae %}const HOSTNAME = process.env.HOSTNAME;{% endif %}
const WORKFLOW_REPO_NAME = process.env.WORKFLOW_REPO_NAME;
const WORKFLOW_REPO_OWNER = process.env.WORKFLOW_REPO_OWNER;
// Create an instance of `Octokit` using the token{% ifversion ghes or ghae %} and hostname{% endif %} values that were set in the {% data variables.product.prodname_actions %} workflow.
const octokit = new Octokit({ {% ifversion ghes or ghae %}
baseUrl: "{% data variables.product.api_url_code %}",{% endif %}
auth: TOKEN,
});
try {
// Get the last time that this script ran from the configuration variable. If the variable is not defined, use the current time minus 24 hours.
const lastStoredRedeliveryTime = await getVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
const lastWebhookRedeliveryTime = lastStoredRedeliveryTime || (Date.now() - (24 * 60 * 60 * 1000)).toString();
// Record the time that this script started redelivering webhooks.
const newWebhookRedeliveryTime = Date.now().toString();
// Get the webhook deliveries that were delivered after `lastWebhookRedeliveryTime`.
const deliveries = await fetchWebhookDeliveriesSince({
lastWebhookRedeliveryTime,
repoOwner: REPO_OWNER,
repoName: REPO_NAME,
hookId: HOOK_ID,
octokit,
});
// Consolidate deliveries that have the same globally unique identifier (GUID). The GUID is constant across redeliveries of the same delivery.
let deliveriesByGuid = {};
for (const delivery of deliveries) {
deliveriesByGuid[delivery.guid]
? deliveriesByGuid[delivery.guid].push(delivery)
: (deliveriesByGuid[delivery.guid] = [delivery]);
}
// For each GUID value, if no deliveries for that GUID have been successfully delivered within the time frame, get the delivery ID of one of the deliveries with that GUID.
//
// This will prevent duplicate redeliveries if a delivery has failed multiple times.
// This will also prevent redelivery of failed deliveries that have already been successfully redelivered.
let failedDeliveryIDs = [];
for (const guid in deliveriesByGuid) {
const deliveries = deliveriesByGuid[guid];
const anySucceeded = deliveries.some(
(delivery) => delivery.status === "OK"
);
if (!anySucceeded) {
failedDeliveryIDs.push(deliveries[0].id);
}
}
// Redeliver any failed deliveries.
for (const deliveryId of failedDeliveryIDs) {
await redeliverWebhook({
deliveryId,
repoOwner: REPO_OWNER,
repoName: REPO_NAME,
hookId: HOOK_ID,
octokit,
});
}
// Update the configuration variable (or create the variable if it doesn't already exist) to store the time that this script started.
// This value will be used next time this script runs.
await updateVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
value: newWebhookRedeliveryTime,
variableExists: Boolean(lastStoredRedeliveryTime),
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
// Log the number of redeliveries.
console.log(
`Redelivered ${
failedDeliveryIDs.length
} failed webhook deliveries out of ${
deliveries.length
} total deliveries since ${new Date(Number(lastWebhookRedeliveryTime))}.`
);
} catch (error) {
// If there was an error, log the error so that it appears in the workflow run log, then throw the error so that the workflow run registers as a failure.
if (error.response) {
console.error(
`Failed to check and redeliver webhooks: ${error.response.data.message}`
);
}
console.error(error);
throw(error);
}
}
// This function will fetch all of the webhook deliveries that were delivered since `lastWebhookRedeliveryTime`.
// It uses the `octokit.paginate.iterator()` method to iterate through paginated results. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript#making-paginated-requests)."
//
// If a page of results includes deliveries that occurred before `lastWebhookRedeliveryTime`,
// it will store only the deliveries that occurred after `lastWebhookRedeliveryTime` and then stop.
// Otherwise, it will store all of the deliveries from the page and request the next page.
async function fetchWebhookDeliveriesSince({
lastWebhookRedeliveryTime,
repoOwner,
repoName,
hookId,
octokit,
}) {
const iterator = octokit.paginate.iterator(
"GET /repos/{owner}/{repo}/hooks/{hook_id}/deliveries",
{
owner: repoOwner,
repo: repoName,
hook_id: hookId,
per_page: 100,{% ifversion api-date-versioning %}
headers: {
"x-github-api-version": "{{ allVersions[currentVersion].latestApiVersion }}",
},{% endif %}
}
);
const deliveries = [];
for await (const { data } of iterator) {
const oldestDeliveryTimestamp = new Date(
data[data.length - 1].delivered_at
).getTime();
if (oldestDeliveryTimestamp < lastWebhookRedeliveryTime) {
for (const delivery of data) {
if (
new Date(delivery.delivered_at).getTime() > lastWebhookRedeliveryTime
) {
deliveries.push(delivery);
} else {
break;
}
}
break;
} else {
deliveries.push(...data);
}
}
return deliveries;
}
// This function will redeliver a failed webhook delivery.
async function redeliverWebhook({
deliveryId,
repoOwner,
repoName,
hookId,
octokit,
}) {
await octokit.request(
"POST /repos/{owner}/{repo}/hooks/{hook_id}/deliveries/{delivery_id}/attempts",
{
owner: repoOwner,
repo: repoName,
hook_id: hookId,
delivery_id: deliveryId,
}
);
}
// This function gets the value of a configuration variable.
// If the variable does not exist, the endpoint returns a 404 response and this function returns `undefined`.
async function getVariable({ variableName, repoOwner, repoName, octokit }) {
try {
const {
data: { value },
} = await octokit.request(
"GET /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
}
);
return value;
} catch (error) {
if (error.status === 404) {
return undefined;
} else {
throw error;
}
}
}
// This function will update a configuration variable (or create the variable if it doesn't already exist). For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
async function updateVariable({
variableName,
value,
variableExists,
repoOwner,
repoName,
octokit,
}) {
if (variableExists) {
await octokit.request(
"PATCH /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
}
);
} else {
await octokit.request("POST /repos/{owner}/{repo}/actions/variables", {
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
});
}
}
// This will execute the `checkAndRedeliverWebhooks` function.
(async () => {
await checkAndRedeliverWebhooks();
})();
```
## Testing the script
You can manually trigger your workflow to test the script. For more information, see "[AUTOTITLE](/actions/using-workflows/manually-running-a-workflow)" and "[AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-workflow-run-logs)."
## Alternative methods
This example used {% data variables.product.prodname_actions %} to securely store credentials and to run the script on a schedule. However, if you prefer to run this script on your server that handles webhook deliveries, you can:
- Store the credentials in another secure manner, such as a secret manager like [Azure key vault](https://azure.microsoft.com/products/key-vault). You will also need to update the script to access the credentials from their new location.
- Run the script on a schedule on your server, for example by using a cron job or task scheduler.
- Update the script to store the last run time somewhere that your server can access and update, as in the sketch after this list. If you choose not to store the last run time as a {% data variables.product.prodname_actions %} configuration variable, you can remove the API calls to access and update the variable.
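For example, the following hedged sketch persists the last run time in a local file instead of a repository variable. The file path is an arbitrary choice, and the fallback mirrors the script's default of the current time minus 24 hours.

```javascript copy
// A hedged sketch: persist the last run time in a local file on your server
// instead of a repository configuration variable. The file path is arbitrary.
const fs = require("fs");
const STATE_FILE = "/var/lib/webhook-redelivery/last-run";

function readLastRunTime() {
  try {
    return Number(fs.readFileSync(STATE_FILE, "utf8"));
  } catch {
    return Date.now() - 24 * 60 * 60 * 1000; // default: 24 hours ago
  }
}

function writeLastRunTime(ms) {
  fs.writeFileSync(STATE_FILE, String(ms));
}
```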


@@ -0,0 +1,354 @@
---
title: Automatically redelivering failed deliveries for an organization webhook
shortTitle: Automatically redeliver for organization
intro: You can write a script to handle failed deliveries of an organization webhook.
versions:
fpt: '*'
ghes: '*'
ghae: '*'
ghec: '*'
topics:
- Webhooks
layout: inline
redirect_from:
- /webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-an-organization-webhook
---
## About automatically redelivering failed deliveries
This article describes how to write a script to find and redeliver failed deliveries for an organization webhook. For more information about failed deliveries, see "[AUTOTITLE](/webhooks/using-webhooks/handling-failed-webhook-deliveries)."
This example shows you:
- A script that will find and redeliver failed deliveries for an organization webhook
- What credentials your script will need, and how to store the credentials securely as {% data variables.product.prodname_actions %} secrets
- A {% data variables.product.prodname_actions %} workflow that can securely access your credentials and run the script periodically
This example uses {% data variables.product.prodname_actions %}, but you can also run this script on your server that handles webhook deliveries. For more information, see "[Alternative methods](#alternative-methods)."
## Storing credentials for the script
The built-in `GITHUB_TOKEN` does not have sufficient permissions to redeliver webhooks. Instead of using `GITHUB_TOKEN`, this example uses a {% data variables.product.pat_generic %}. Alternatively, instead of creating a {% data variables.product.pat_generic %}, you can create a {% data variables.product.prodname_github_app %} and use the app's credentials to create an installation access token during the {% data variables.product.prodname_actions %} workflow. For more information, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/making-authenticated-api-requests-with-a-github-app-in-a-github-actions-workflow)."
{% ifversion pat-v2 %}
1. Create a {% data variables.product.pat_generic %} with the following access. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
- For a {% data variables.product.pat_v2 %}:
- Set resource owner to be the organization where your webhook was created
- Grant the token access to the repository where this workflow will run
- Grant the token write access to the organization webhooks permission
- Grant the token write access to the repository variables permission
- For a {% data variables.product.pat_v1 %}, grant the token the `admin:org_hook` and `repo` scopes.
{% else %}
1. Create a {% data variables.product.pat_v1 %} with the `admin:org_hook` and `repo` scopes. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
{% endif %}
1. Store your {% data variables.product.pat_generic %} as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
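After you create and store the token, you can sanity-check its access with a one-off request to the same deliveries endpoint that the script uses. The following is a minimal sketch; the organization name and hook ID are placeholders.

```javascript copy
// A minimal sketch, separate from the final script: confirm that the token can
// read deliveries for your organization webhook. Replace the placeholder
// organization name and hook ID with your own values.
const { Octokit } = require("octokit");

(async () => {
  const octokit = new Octokit({ auth: process.env.TOKEN });
  const { data } = await octokit.request(
    "GET /orgs/{org}/hooks/{hook_id}/deliveries",
    {
      org: "YOUR_ORGANIZATION_NAME",
      hook_id: 123456, // placeholder webhook ID
      per_page: 1,
    }
  );
  console.log(data);
})();
```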
## Adding a workflow that will run the script
This section demonstrates how you can use a {% data variables.product.prodname_actions %} workflow to securely access the credentials that you stored in the previous section, set environment variables, and periodically run a script to find and redeliver failed deliveries.
Copy this {% data variables.product.prodname_actions %} workflow into a YAML file in the `.github/workflows` directory in the repository where you want the workflow to run. Replace the placeholders in the `Run script` step as described below.
```yaml copy annotate
#
name: Redeliver failed webhook deliveries
# This workflow runs every 6 hours or when manually triggered.
on:
schedule:
- cron: '15 */6 * * *'
workflow_dispatch:
# This workflow will use the built-in `GITHUB_TOKEN` to check out the repository contents. This grants `GITHUB_TOKEN` permission to do that.
permissions:
contents: read
#
jobs:
redeliver-failed-deliveries:
name: Redeliver failed deliveries
runs-on: ubuntu-latest
steps:
# This workflow will run a script that is stored in the repository. This step checks out the repository contents so that the workflow can access the script.
- name: Check out repo content
uses: {% data reusables.actions.action-checkout %}
# This step sets up Node.js. The script that this workflow will run uses Node.js.
- name: Setup Node.js
uses: {% data reusables.actions.action-setup-node %}
with:
node-version: '18.x'
# This step installs the octokit library. The script that this workflow will run uses the octokit library.
- name: Install dependencies
run: npm install octokit
# This step sets some environment variables, then runs a script to find and redeliver failed webhook deliveries.
# - Replace `YOUR_SECRET_NAME` with the name of the secret where you stored your {% data variables.product.pat_generic %}.
# - Replace `YOUR_ORGANIZATION_NAME` with the name of the organization where the webhook was created.
# - Replace `YOUR_HOOK_ID` with the ID of the webhook.
# - Replace `YOUR_LAST_REDELIVERY_VARIABLE_NAME` with the name that you want to use for a configuration variable that will be stored in the repository where this workflow is stored. The name can be any string that contains only alphanumeric characters and `_`, and does not start with `GITHUB_` or a number. For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
{% ifversion ghes or ghae %}# - Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}.{% endif %}
- name: Run script
env:
TOKEN: {% raw %}${{ secrets.YOUR_SECRET_NAME }}{% endraw %}
ORGANIZATION_NAME: 'YOUR_ORGANIZATION_NAME'
HOOK_ID: 'YOUR_HOOK_ID'
LAST_REDELIVERY_VARIABLE_NAME: 'YOUR_LAST_REDELIVERY_VARIABLE_NAME'
{% ifversion ghes or ghae %}HOSTNAME: 'YOUR_HOSTNAME'{% endif %}
WORKFLOW_REPO_NAME: {% raw %}${{ github.event.repository.name }}{% endraw %}
WORKFLOW_REPO_OWNER: {% raw %}${{ github.repository_owner }}{% endraw %}
run: |
node .github/workflows/scripts/redeliver-failed-deliveries.js
```
## Adding the script
This section demonstrates how you can write a script to find and redeliver failed deliveries.
Copy this script into a file called `.github/workflows/scripts/redeliver-failed-deliveries.js` in the same repository where you saved the {% data variables.product.prodname_actions %} workflow file above.
```javascript copy annotate
// This script uses {% data variables.product.company_short %}'s Octokit SDK to make API requests. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript)."
const { Octokit } = require("octokit");
//
async function checkAndRedeliverWebhooks() {
// Get the values of environment variables that were set by the {% data variables.product.prodname_actions %} workflow.
const TOKEN = process.env.TOKEN;
const ORGANIZATION_NAME = process.env.ORGANIZATION_NAME;
const HOOK_ID = process.env.HOOK_ID;
const LAST_REDELIVERY_VARIABLE_NAME = process.env.LAST_REDELIVERY_VARIABLE_NAME;
{% ifversion ghes or ghae %}const HOSTNAME = process.env.HOSTNAME;{% endif %}
const WORKFLOW_REPO_NAME = process.env.WORKFLOW_REPO_NAME;
const WORKFLOW_REPO_OWNER = process.env.WORKFLOW_REPO_OWNER;
// Create an instance of `Octokit` using the token{% ifversion ghes or ghae %} and hostname{% endif %} values that were set in the {% data variables.product.prodname_actions %} workflow.
const octokit = new Octokit({ {% ifversion ghes or ghae %}
baseUrl: "{% data variables.product.api_url_code %}",{% endif %}
auth: TOKEN,
});
try {
// Get the last time that this script ran from the configuration variable. If the variable is not defined, use the current time minus 24 hours.
const lastStoredRedeliveryTime = await getVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
const lastWebhookRedeliveryTime = lastStoredRedeliveryTime || (Date.now() - (24 * 60 * 60 * 1000)).toString();
// Record the time that this script started redelivering webhooks.
const newWebhookRedeliveryTime = Date.now().toString();
// Get the webhook deliveries that were delivered after `lastWebhookRedeliveryTime`.
const deliveries = await fetchWebhookDeliveriesSince({
lastWebhookRedeliveryTime,
organizationName: ORGANIZATION_NAME,
hookId: HOOK_ID,
octokit,
});
// Consolidate deliveries that have the same globally unique identifier (GUID). The GUID is constant across redeliveries of the same delivery.
let deliveriesByGuid = {};
for (const delivery of deliveries) {
deliveriesByGuid[delivery.guid]
? deliveriesByGuid[delivery.guid].push(delivery)
: (deliveriesByGuid[delivery.guid] = [delivery]);
}
// For each GUID value, if no deliveries for that GUID have been successfully delivered within the time frame, get the delivery ID of one of the deliveries with that GUID.
//
// This will prevent duplicate redeliveries if a delivery has failed multiple times.
// This will also prevent redelivery of failed deliveries that have already been successfully redelivered.
let failedDeliveryIDs = [];
for (const guid in deliveriesByGuid) {
const deliveries = deliveriesByGuid[guid];
const anySucceeded = deliveries.some(
(delivery) => delivery.status === "OK"
);
if (!anySucceeded) {
failedDeliveryIDs.push(deliveries[0].id);
}
}
// Redeliver any failed deliveries.
for (const deliveryId of failedDeliveryIDs) {
await redeliverWebhook({
deliveryId,
organizationName: ORGANIZATION_NAME,
hookId: HOOK_ID,
octokit,
});
}
// Update the configuration variable (or create the variable if it doesn't already exist) to store the time that this script started.
// This value will be used next time this script runs.
await updateVariable({
variableName: LAST_REDELIVERY_VARIABLE_NAME,
value: newWebhookRedeliveryTime,
variableExists: Boolean(lastStoredRedeliveryTime),
repoOwner: WORKFLOW_REPO_OWNER,
repoName: WORKFLOW_REPO_NAME,
octokit,
});
// Log the number of redeliveries.
console.log(
`Redelivered ${
failedDeliveryIDs.length
} failed webhook deliveries out of ${
deliveries.length
} total deliveries since ${new Date(Number(lastWebhookRedeliveryTime))}.`
);
} catch (error) {
// If there was an error, log the error so that it appears in the workflow run log, then throw the error so that the workflow run registers as a failure.
if (error.response) {
console.error(
`Failed to check and redeliver webhooks: ${error.response.data.message}`
);
}
console.error(error);
throw(error);
}
}
// This function will fetch all of the webhook deliveries that were delivered since `lastWebhookRedeliveryTime`.
// It uses the `octokit.paginate.iterator()` method to iterate through paginated results. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript#making-paginated-requests)."
//
// If a page of results includes deliveries that occurred before `lastWebhookRedeliveryTime`,
// it will store only the deliveries that occurred after `lastWebhookRedeliveryTime` and then stop.
// Otherwise, it will store all of the deliveries from the page and request the next page.
async function fetchWebhookDeliveriesSince({
lastWebhookRedeliveryTime,
organizationName,
hookId,
octokit,
}) {
const iterator = octokit.paginate.iterator(
"GET /orgs/{org}/hooks/{hook_id}/deliveries",
{
org: organizationName,
hook_id: hookId,
per_page: 100,{% ifversion api-date-versioning %}
headers: {
"x-github-api-version": "{{ allVersions[currentVersion].latestApiVersion }}",
},{% endif %}
}
);
const deliveries = [];
for await (const { data } of iterator) {
const oldestDeliveryTimestamp = new Date(
data[data.length - 1].delivered_at
).getTime();
if (oldestDeliveryTimestamp < lastWebhookRedeliveryTime) {
for (const delivery of data) {
if (
new Date(delivery.delivered_at).getTime() > lastWebhookRedeliveryTime
) {
deliveries.push(delivery);
} else {
break;
}
}
break;
} else {
deliveries.push(...data);
}
}
return deliveries;
}
// This function will redeliver a failed webhook delivery.
async function redeliverWebhook({
deliveryId,
organizationName,
hookId,
octokit,
}) {
await octokit.request(
"POST /orgs/{org}/hooks/{hook_id}/deliveries/{delivery_id}/attempts",
{
org: organizationName,
hook_id: hookId,
delivery_id: deliveryId,
}
);
}
// This function gets the value of a configuration variable.
// If the variable does not exist, the endpoint returns a 404 response and this function returns `undefined`.
async function getVariable({ variableName, repoOwner, repoName, octokit }) {
try {
const {
data: { value },
} = await octokit.request(
"GET /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
}
);
return value;
} catch (error) {
if (error.status === 404) {
return undefined;
} else {
throw error;
}
}
}
// This function will update a configuration variable (or create the variable if it doesn't already exist). For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
async function updateVariable({
variableName,
value,
variableExists,
repoOwner,
repoName,
octokit,
}) {
if (variableExists) {
await octokit.request(
"PATCH /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
}
);
} else {
await octokit.request("POST /repos/{owner}/{repo}/actions/variables", {
owner: repoOwner,
repo: repoName,
name: variableName,
value: value,
});
}
}
// This will execute the `checkAndRedeliverWebhooks` function.
(async () => {
await checkAndRedeliverWebhooks();
})();
```
## Testing the script
You can manually trigger your workflow to test the script. For more information, see "[AUTOTITLE](/actions/using-workflows/manually-running-a-workflow)" and "[AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-workflow-run-logs)."
## Alternative methods
This example used {% data variables.product.prodname_actions %} to securely store credentials and to run the script on a schedule. However, if you prefer to run this script on your server that handles webhook deliveries, you can:
- Store the credentials in another secure manner, such as a secret manager like [Azure key vault](https://azure.microsoft.com/products/key-vault), as in the sketch after this list. You will also need to update the script to access the credentials from their new location.
- Run the script on a schedule on your server, for example by using a cron job or task scheduler.
- Update the script to store the last run time somewhere that your server can access and update. If you choose not to store the last run time as a {% data variables.product.prodname_actions %} configuration variable, you can remove the API calls to access and update the variable.
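For the secret manager option above, the following hedged sketch reads the token from Azure Key Vault using the official `@azure/identity` and `@azure/keyvault-secrets` packages; the vault URL and secret name are placeholders, and the packages are not otherwise used in this article.

```javascript copy
// A hedged sketch, assuming the official `@azure/identity` and
// `@azure/keyvault-secrets` packages: read the {% data variables.product.pat_generic %}
// from Azure Key Vault instead of an Actions secret. The vault URL and secret
// name are placeholders.
const { DefaultAzureCredential } = require("@azure/identity");
const { SecretClient } = require("@azure/keyvault-secrets");

async function getToken() {
  const client = new SecretClient(
    "https://YOUR-VAULT-NAME.vault.azure.net",
    new DefaultAzureCredential()
  );
  const secret = await client.getSecret("webhook-redelivery-token");
  return secret.value;
}
```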


@@ -28,278 +28,13 @@ You can also write a script that checks for failed deliveries and attempts to re
{% ifversion fpt %}There are no API endpoints to get data about {% data variables.product.prodname_marketplace %} webhooks or {% data variables.product.prodname_sponsors %} webhooks.{% endif %}{% ifversion ghec %}There are no API endpoints to get data about {% data variables.product.prodname_marketplace %} webhooks, {% data variables.product.prodname_sponsors %} webhooks, or global webhooks.{% endif %}{% ifversion ghes or ghae %}There are no API endpoints to get data about global webhook deliveries.{% endif %}
1. Look at the fetched data to see if any deliveries failed. The data for a failed delivery will have a `status` value that is not `OK` (see the sketch after this list).
1. Use the {% data variables.product.company_short %} REST API to redeliver any deliveries that failed. For more information, see "[AUTOTITLE](/rest/webhooks/repo-deliveries#redeliver-a-delivery-for-a-repository-webhook)," "[AUTOTITLE](/rest/orgs/webhooks#redeliver-a-delivery-for-an-organization-webhook)," and "[AUTOTITLE](/rest/apps/webhooks#redeliver-a-delivery-for-an-app-webhook)."
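For instance, once you have an array of fetched deliveries, a filter like the following picks out the failures (a sketch; `deliveries` is assumed to hold the data returned by the list-deliveries endpoints above):

```javascript
// A sketch: keep only the deliveries whose status is not "OK".
const failedDeliveries = deliveries.filter(
  (delivery) => delivery.status !== "OK"
);
```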
For example scripts, see:
- "[AUTOTITLE](/webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-a-repository-webhook)"
- "[AUTOTITLE](/webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-an-organization-webhook)"
- "[AUTOTITLE](/webhooks/using-webhooks/creating-a-script-to-automatically-redeliver-failed-deliveries-for-a-github-app-webhook)"
If a webhook delivery fails repeatedly, you should investigate the cause. Each failed delivery will give a reason for failure. For more information, see "[AUTOTITLE](/webhooks/testing-and-troubleshooting-webhooks/troubleshooting-webhooks)."
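Each delivery also has details you can inspect through the API. For example, here is a minimal sketch that fetches one repository webhook delivery to examine the failure reason (the placeholder values and the delivery ID are illustrative):

```javascript
// A minimal sketch: fetch the full details of a single delivery to see
// why it failed. Shown for a repository webhook; `12345` is a made-up
// delivery ID and the placeholder values are assumptions.
const { Octokit } = require("octokit");

const octokit = new Octokit({ auth: process.env.TOKEN });

(async () => {
  const { data: delivery } = await octokit.request(
    "GET /repos/{owner}/{repo}/hooks/{hook_id}/deliveries/{delivery_id}",
    {
      owner: "YOUR_REPO_OWNER",
      repo: "YOUR_REPO_NAME",
      hook_id: "YOUR_HOOK_ID",
      delivery_id: 12345,
    }
  );
  // `status` holds a human-readable failure reason, and `status_code`
  // holds the HTTP status code that your server returned.
  console.log(delivery.status, delivery.status_code);
})();
```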
## Example for repository webhooks
You can use {% data variables.product.prodname_actions %} to run a script periodically to find and redeliver any failed deliveries. For more information about {% data variables.product.prodname_actions %}, see "[AUTOTITLE](/actions)."
The built-in `GITHUB_TOKEN` does not have sufficient permissions to redeliver webhooks. Instead of using `GITHUB_TOKEN`, this example uses a {% data variables.product.pat_generic %}. Alternatively, instead of creating a {% data variables.product.pat_generic %}, you can create a {% data variables.product.prodname_github_app %} and use the app's credentials to create an installation access token during the {% data variables.product.prodname_actions %} workflow. For more information, see "[AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/making-authenticated-api-requests-with-a-github-app-in-a-github-actions-workflow)."
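If you take the {% data variables.product.prodname_github_app %} route, a minimal sketch of authenticating Octokit with app credentials might look like the following. It uses the `@octokit/auth-app` authentication strategy; the environment variable names are illustrative assumptions:

```javascript
// A minimal sketch: authenticate as a GitHub App installation instead of
// using a personal access token. Assumes the `@octokit/auth-app` package
// is installed; the environment variable names are illustrative.
const { Octokit } = require("octokit");
const { createAppAuth } = require("@octokit/auth-app");

const octokit = new Octokit({
  authStrategy: createAppAuth,
  auth: {
    appId: process.env.APP_ID,
    privateKey: process.env.APP_PRIVATE_KEY,
    installationId: process.env.INSTALLATION_ID,
  },
});
```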
{% ifversion pat-v2 %}
1. Create a {% data variables.product.pat_generic %} with the following access. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
- For a {% data variables.product.pat_v2 %}, grant the token:
- write access to the repository webhooks permission
- write access to the repository variables permission
- access to the repository where your webhook was created
- For a {% data variables.product.pat_v1 %}, grant the token the `repo` scope.
{% else %}
1. Create a {% data variables.product.pat_v1 %} with the `repo` scope. For more information, see "[AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens)."
{% endif %}
1. Store your {% data variables.product.pat_generic %} as a {% data variables.product.prodname_actions %} secret in the repository where you want the workflow to run. For more information, see "[AUTOTITLE](/actions/security-guides/encrypted-secrets)."
1. Copy this {% data variables.product.prodname_actions %} workflow into a YAML file in the `.github/workflows` directory in the repository where you want the workflow to run. Replace the placeholders in the `Run script` step as described below.
```yaml copy annotate
#
name: Redeliver failed webhook deliveries
# This workflow runs every 6 hours or when manually triggered.
on:
schedule:
- cron: '20 */6 * * *'
workflow_dispatch:
# This workflow will use the built-in `GITHUB_TOKEN` to check out the repository contents. This grants `GITHUB_TOKEN` permission to do that.
permissions:
contents: read
#
jobs:
redeliver-failed-deliveries:
name: Redeliver failed deliveries
runs-on: ubuntu-latest
steps:
# This workflow will run a script that is stored in the repository. This step checks out the repository contents so that the workflow can access the script.
- name: Check out repo content
uses: {% data reusables.actions.action-checkout %}
# This step sets up Node.js. The script that this workflow will run uses Node.js.
- name: Setup Node.js
uses: {% data reusables.actions.action-setup-node %}
with:
node-version: '18.x'
# This step installs the octokit library. The script that this workflow will run uses the octokit library.
- name: Install dependencies
run: npm install octokit
# This step sets some environment variables, then runs a script to find and redeliver failed webhook deliveries.
# The endpoints that the script will use need the repository name, repository owner, and hook ID.
# - Replace `YOUR_SECRET_NAME` with the name of the secret that you created in the previous step.
# - Replace `YOUR_REPO_OWNER` with the owner of the repository where the webhook was created.
# - Replace `YOUR_REPO_NAME` with the name of the repository where the webhook was created.
# - Replace `YOUR_HOOK_ID` with the ID of the webhook.
# - Replace `YOUR_LAST_REDELIVERY_VARIABLE_NAME` with the name that you want to use for a configuration variable that will be stored in your repository. The name can be any string that contains only alphanumeric characters and `_` and does not start with `GITHUB_` or a number. For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
{% ifversion ghes or ghae %}# - Replace `YOUR_HOSTNAME` with the name of {% data variables.location.product_location %}.{% endif %}
- name: Run script
env:
TOKEN: {% raw %}${{ secrets.YOUR_SECRET_NAME }}{% endraw %}
REPO_OWNER: 'YOUR_REPO_OWNER'
REPO_NAME: 'YOUR_REPO_NAME'
HOOK_ID: 'YOUR_HOOK_ID'
LAST_REDELIVERY_VARIABLE_NAME: 'YOUR_LAST_REDELIVERY_VARIABLE_NAME'
{% ifversion ghes or ghae %}HOSTNAME: 'YOUR_HOSTNAME'{% endif %}
run: |
node .github/workflows/scripts/redeliver-failed-deliveries.js
```
1. Copy this script into a file called `.github/workflows/scripts/redeliver-failed-deliveries.js` in the same repository where you saved the {% data variables.product.prodname_actions %} workflow file above.
```javascript copy annotate
// This script uses {% data variables.product.company_short %}'s Octokit SDK to make API requests. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript)."
const { Octokit } = require("octokit");
// Get the values of environment variables that were set by the {% data variables.product.prodname_actions %} workflow.
const TOKEN = process.env.TOKEN;
const REPO_OWNER = process.env.REPO_OWNER;
const REPO_NAME = process.env.REPO_NAME;
const HOOK_ID = process.env.HOOK_ID;
const LAST_REDELIVERY_VARIABLE_NAME = process.env.LAST_REDELIVERY_VARIABLE_NAME;
{% ifversion ghes or ghae %}const HOSTNAME = process.env.HOSTNAME;{% endif %}
// Create an instance of `Octokit` using the token{% ifversion ghes or ghae %} and hostname{% endif %} values that were set in the {% data variables.product.prodname_actions %} workflow.
const octokit = new Octokit({ {% ifversion ghes or ghae %}
baseUrl: "{% data variables.product.api_url_code %}",{% endif %}
auth: TOKEN,
});
//
async function checkAndRedeliverWebhooks() {
try {
// Get the last time that this script ran from the configuration variable. If the variable is not defined, use the current time minus 24 hours.
const lastStoredRedeliveryTime = await getVariable(LAST_REDELIVERY_VARIABLE_NAME);
const lastWebhookRedeliveryTime = lastStoredRedeliveryTime || `${Date.now() - (24 * 60 * 60 * 1000)}`;
// Record the time that this script started redelivering webhooks.
const newWebhookRedeliveryTime = `${Date.now()}`;
// Get the webhook deliveries that were delivered after `lastWebhookRedeliveryTime`.
const deliveries = await fetchWebhookDeliveriesSince(lastWebhookRedeliveryTime);
// Consolidate deliveries that have the same globally unique identifier (GUID). The GUID is constant across redeliveries of the same delivery.
let deliveriesByGuid = {};
for (const delivery of deliveries) {
deliveriesByGuid[delivery.guid]
? deliveriesByGuid[delivery.guid].push(delivery)
: (deliveriesByGuid[delivery.guid] = [delivery]);
}
// For each GUID value, if no deliveries for that GUID have been successfully delivered within the time frame, get the delivery ID of one of the deliveries with that GUID.
//
// This will prevent duplicate redeliveries if a delivery has failed multiple times.
// This will also prevent redelivery of failed deliveries that have already been successfully redelivered.
let failedDeliveryIDs = [];
for (const guid in deliveriesByGuid) {
const deliveries = deliveriesByGuid[guid];
const anySucceeded = deliveries.some(
(delivery) => delivery.status === "OK"
);
if (!anySucceeded) {
failedDeliveryIDs.push(deliveries[0].id);
}
}
// Redeliver any failed deliveries.
for (const id of failedDeliveryIDs) {
await redeliverWebhook(id);
}
// Update the configuration variable (or create the variable if it doesn't already exist) to store the time that this script started.
// This value will be used next time this script runs.
await updateVariable({
name: LAST_REDELIVERY_VARIABLE_NAME,
value: newWebhookRedeliveryTime,
variableExists: Boolean(lastStoredRedeliveryTime),
});
// Log the number of redeliveries.
console.log(
`Redelivered ${
failedDeliveryIDs.length
} failed webhook deliveries out of ${
deliveries.length
      } total deliveries since ${new Date(Number(lastWebhookRedeliveryTime))}.`
);
} catch (error) {
if (error.response) {
console.error(
`Failed to check and redeliver webhooks: ${error.response.data.message}`
);
}
console.error(error);
}
}
// This function will fetch all of the webhook deliveries that were delivered since `lastWebhookRedeliveryTime`.
// It uses the `octokit.paginate.iterator()` method to iterate through paginated results. For more information, see "[AUTOTITLE](/rest/guides/scripting-with-the-rest-api-and-javascript#making-paginated-requests)."
//
// If a page of results includes deliveries that occurred before `lastWebhookRedeliveryTime`,
// it will store only the deliveries that occurred after `lastWebhookRedeliveryTime` and then stop.
// Otherwise, it will store all of the deliveries from the page and request the next page.
async function fetchWebhookDeliveriesSince(lastWebhookRedeliveryTime) {
const iterator = octokit.paginate.iterator(
"GET /repos/{owner}/{repo}/hooks/{hook_id}/deliveries",
{
owner: REPO_OWNER,
repo: REPO_NAME,
hook_id: HOOK_ID,
per_page: 100,{% ifversion api-date-versioning %}
headers: {
"x-github-api-version": "{{ allVersions[currentVersion].latestApiVersion }}",
},{% endif %}
}
);
const deliveries = [];
for await (const { data } of iterator) {
const oldestDeliveryTimestamp = new Date(
data[data.length - 1].delivered_at
).getTime();
if (oldestDeliveryTimestamp < lastWebhookRedeliveryTime) {
for (const delivery of data) {
if (new Date(delivery.delivered_at).getTime() > lastWebhookRedeliveryTime) {
deliveries.push(delivery);
} else {
break;
}
}
break;
} else {
deliveries.push(...data);
}
}
return deliveries;
}
// This function will redeliver a failed webhook delivery.
async function redeliverWebhook(deliveryId) {
await octokit.request("POST /repos/{owner}/{repo}/hooks/{hook_id}/deliveries/{delivery_id}/attempts", {
owner: REPO_OWNER,
repo: REPO_NAME,
hook_id: HOOK_ID,
delivery_id: deliveryId,
});
}
// This function gets the value of a configuration variable.
// If the variable does not exist, the endpoint returns a 404 response and this function returns `undefined`.
async function getVariable(variableName) {
try {
const {
data: { value },
} = await octokit.request(
"GET /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: REPO_OWNER,
repo: REPO_NAME,
name: variableName,
}
);
return value;
} catch (error) {
if (error.status === 404) {
return undefined;
} else {
throw error;
}
}
}
// This function will update a configuration variable (or create the variable if it doesn't already exist). For more information, see "[AUTOTITLE](/actions/learn-github-actions/variables#defining-configuration-variables-for-multiple-workflows)."
async function updateVariable({name, value, variableExists}) {
if (variableExists) {
await octokit.request(
"PATCH /repos/{owner}/{repo}/actions/variables/{name}",
{
owner: REPO_OWNER,
repo: REPO_NAME,
name: name,
value: value,
}
);
} else {
await octokit.request("POST /repos/{owner}/{repo}/actions/variables", {
owner: REPO_OWNER,
repo: REPO_NAME,
name: name,
value: value,
});
}
}
// This will execute the `checkAndRedeliverWebhooks` function.
(async () => {
await checkAndRedeliverWebhooks();
})();
```
1. You can manually trigger your workflow to test it. For more information, see "[AUTOTITLE](/actions/using-workflows/manually-running-a-workflow)."

View File

@@ -13,9 +13,11 @@ children:
- /handling-webhook-deliveries
- /validating-webhook-deliveries
- /editing-webhooks
- /handling-failed-webhook-deliveries
- /disabling-webhooks
- /best-practices-for-using-webhooks
- /delivering-webhooks-to-private-systems
- /handling-failed-webhook-deliveries
- /automatically-redelivering-failed-deliveries-for-a-repository-webhook
- /automatically-redelivering-failed-deliveries-for-an-organization-webhook
- /automatically-redelivering-failed-deliveries-for-a-github-app-webhook
---

View File

@@ -289,6 +289,8 @@ sections:
{% data reusables.release-notes.2023-09-config-apply-timeout-hookshot-go-replicas %} [Updated: 2023-09-21]
- |
{% data reusables.release-notes.cache-replica-servers-known-issue %} [Updated: 2023-09-26]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]
deprecations:

View File

@@ -57,4 +57,6 @@ sections:
- |
After an administrator enables maintenance mode from the instance's Management Console UI using Firefox, the administrator is redirected to the Settings page, but maintenance mode is not enabled. To work around this issue, use a different browser.
- |
{% data reusables.release-notes.cache-replica-servers-known-issue %} [Updated: 2023-09-26]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -35,3 +35,5 @@ sections:
{% data reusables.release-notes.2023-09-config-apply-timeout-hookshot-go-replicas %}
- |
After an administrator enables maintenance mode from the instance's Management Console UI using Firefox, the administrator is redirected to the Settings page, but maintenance mode is not enabled. To work around this issue, use a different browser.
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -468,6 +468,8 @@ sections:
On an instance with a GitHub Advanced Security license where secret scanning is enabled, excessive logging in `/var/log` may cause user-facing errors and degraded system performance if logs consume all free space on the volume. To prevent this issue from impacting users, monitor free space on your instance's root volume. For more information, see "[Configuring secret scanning for your appliance](/admin/code-security/managing-github-advanced-security-for-your-enterprise/configuring-secret-scanning-for-your-appliance)" and "[Monitoring your appliance](/admin/enterprise-management/monitoring-your-appliance)." If you suspect that this issue is affecting your instance and you need help, [contact GitHub Support](https://support.github.com/contact). [Updated: 2023-05-03]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]
deprecations:
- heading: Unsecure algorithms disabled for administrative SSH connections

View File

@@ -63,3 +63,5 @@ sections:
On an instance with a GitHub Advanced Security license where secret scanning is enabled, excessive logging in `/var/log` may cause user-facing errors and degraded system performance if logs consume all free space on the volume. To prevent this issue from impacting users, monitor free space on your instance's root volume. For more information, see "[Configuring secret scanning for your appliance](/admin/code-security/managing-github-advanced-security-for-your-enterprise/configuring-secret-scanning-for-your-appliance)" and "[Monitoring your appliance](/admin/enterprise-management/monitoring-your-appliance)." If you suspect that this issue is affecting your instance and you need help, [contact GitHub Support](https://support.github.com/contact). [Updated: 2023-05-03]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -39,3 +39,5 @@ sections:
{% data reusables.release-notes.mermaid-rendering-known-issue %}
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %}
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -44,3 +44,5 @@ sections:
On an instance with a GitHub Advanced Security license where secret scanning is enabled, excessive logging in `/var/log` may cause user-facing errors and degraded system performance if logs consume all free space on the volume. To prevent this issue from impacting users, monitor free space on your instance's root volume. For more information, see "[Configuring secret scanning for your appliance](/admin/code-security/managing-github-advanced-security-for-your-enterprise/configuring-secret-scanning-for-your-appliance)" and "[Monitoring your appliance](/admin/enterprise-management/monitoring-your-appliance)." If you suspect that this issue is affecting your instance and you need help, [contact GitHub Support](https://support.github.com/contact). [Updated: 2023-05-03]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -55,3 +55,5 @@ sections:
{% data reusables.release-notes.mermaid-rendering-known-issue %} [Updated: 2023-08-18]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -39,3 +39,5 @@ sections:
{% data reusables.release-notes.mermaid-rendering-known-issue %}
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -42,3 +42,5 @@ sections:
Organization owners cannot register a new SSH certificate authority (CA) due to an erroneous suggestion to start a trial. Organization SSH CAs configured before an upgrade to an affected version are still usable after the upgrade. Enterprise owners can still register SSH CAs for all organizations.
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -102,3 +102,5 @@ sections:
{% data reusables.release-notes.migrations-missing-section-known-issue %} [Updated: 2023-08-18]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -23,6 +23,8 @@ sections:
{% data reusables.release-notes.migrations-missing-section-known-issue %} [Updated: 2023-08-18]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]
changes:
- |

View File

@@ -52,3 +52,5 @@ sections:
{% data reusables.release-notes.migrations-blob-storage-unconfigurable-known-issue %} [Updated: 2023-08-18]
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-08-24]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -34,3 +34,5 @@ sections:
{% data reusables.release-notes.mermaid-rendering-known-issue %}
- |
{% data reusables.release-notes.2023-08-mssql-replication-known-issue %} [Updated: 2023-09-04]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -434,6 +434,8 @@ sections:
{% data reusables.release-notes.2023-09-ephemeral-self-hosted-runners-not-auto-upgrading %} [Updated: 2023-09-29]
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]
deprecations:
# https://github.com/github/releases/issues/2826

View File

@@ -154,3 +154,5 @@ sections:
{% data reusables.release-notes.2023-09-ephemeral-self-hosted-runners-not-auto-upgrading %} [Updated: 2023-09-29]
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -69,3 +69,5 @@ sections:
{% data reusables.release-notes.2023-09-ephemeral-self-hosted-runners-not-auto-upgrading %} [Updated: 2023-09-29]
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -63,4 +63,6 @@ sections:
{% data reusables.release-notes.2023-09-ephemeral-self-hosted-runners-not-auto-upgrading %} [Updated: 2023-09-29]
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -53,3 +53,5 @@ sections:
{% data reusables.release-notes.2023-09-ephemeral-self-hosted-runners-not-auto-upgrading %} [Updated: 2023-09-29]
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -60,3 +60,5 @@ sections:
{% data reusables.release-notes.2023-09-config-apply-timeout-hookshot-go-replicas %}
- |
{% data reusables.release-notes.2023-10-resource-activity-queue-not-processed %} [Updated: 2023-10-10]
- |
{% data reusables.release-notes.2023-10-support-bundle-p-flag-not-working %} [Updated: 2023-10-13]

View File

@@ -0,0 +1 @@
When an administrator uses the `-p` flag with the `ghe-support-bundle` utility to collect data for a specific number of hours, the utility erroneously collects more logs than necessary.

385
package-lock.json generated
View File

@@ -90,7 +90,7 @@
},
"devDependencies": {
"@actions/core": "^1.10.0",
"@actions/github": "^5.0.3",
"@actions/github": "^6.0.0",
"@axe-core/playwright": "^4.7.3",
"@github/markdownlint-github": "^0.4.1",
"@graphql-inspector/core": "^5.0.0",
@@ -187,22 +187,25 @@
}
},
"node_modules/@actions/github": {
"version": "5.0.3",
"version": "6.0.0",
"resolved": "https://registry.npmjs.org/@actions/github/-/github-6.0.0.tgz",
"integrity": "sha512-alScpSVnYmjNEXboZjarjukQEzgCRmjMv6Xj47fsdnqGS73bjJNDpiiXmp8jr0UZLdUB6d9jW63IcmddUP+l0g==",
"dev": true,
"license": "MIT",
"dependencies": {
"@actions/http-client": "^2.0.1",
"@octokit/core": "^3.6.0",
"@octokit/plugin-paginate-rest": "^2.17.0",
"@octokit/plugin-rest-endpoint-methods": "^5.13.0"
"@actions/http-client": "^2.2.0",
"@octokit/core": "^5.0.1",
"@octokit/plugin-paginate-rest": "^9.0.0",
"@octokit/plugin-rest-endpoint-methods": "^10.0.0"
}
},
"node_modules/@actions/http-client": {
"version": "2.0.1",
"version": "2.2.0",
"resolved": "https://registry.npmjs.org/@actions/http-client/-/http-client-2.2.0.tgz",
"integrity": "sha512-q+epW0trjVUUHboliPb4UF9g2msf+w61b32tAkFEwL/IwP0DQWgbCMM0Hbe3e3WXSKz5VcUXbzJQgy8Hkra/Lg==",
"dev": true,
"license": "MIT",
"dependencies": {
"tunnel": "^0.0.6"
"tunnel": "^0.0.6",
"undici": "^5.25.4"
}
},
"node_modules/@ampproject/remapping": {
@@ -934,6 +937,15 @@
"node": "^12.22.0 || ^14.17.0 || >=16.0.0"
}
},
"node_modules/@fastify/busboy": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/@fastify/busboy/-/busboy-2.0.0.tgz",
"integrity": "sha512-JUFJad5lv7jxj926GPgymrWQxxjPYuJNiNjNMzqT+HiuP6Vl3dk5xzG+8sTX96np0ZAluvaMzPsjhHZ5rNuNQQ==",
"dev": true,
"engines": {
"node": ">=14"
}
},
"node_modules/@github/auto-check-element": {
"version": "5.2.0",
"license": "MIT",
@@ -1940,214 +1952,6 @@
}
},
"node_modules/@octokit/auth-token": {
"version": "2.5.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/types": "^6.0.3"
}
},
"node_modules/@octokit/core": {
"version": "3.6.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/auth-token": "^2.4.4",
"@octokit/graphql": "^4.5.8",
"@octokit/request": "^5.6.3",
"@octokit/request-error": "^2.0.5",
"@octokit/types": "^6.0.3",
"before-after-hook": "^2.2.0",
"universal-user-agent": "^6.0.0"
}
},
"node_modules/@octokit/core/node_modules/@octokit/graphql": {
"version": "4.8.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/request": "^5.6.0",
"@octokit/types": "^6.0.3",
"universal-user-agent": "^6.0.0"
}
},
"node_modules/@octokit/core/node_modules/@octokit/request-error": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/@octokit/request-error/-/request-error-2.1.0.tgz",
"integrity": "sha512-1VIvgXxs9WHSjicsRwq8PlR2LR2x6DwsJAaFgzdi0JfJoGSO8mYI/cHJQ+9FbN21aa+DrgNLnwObmyeSC8Rmpg==",
"dev": true,
"dependencies": {
"@octokit/types": "^6.0.3",
"deprecation": "^2.0.0",
"once": "^1.4.0"
}
},
"node_modules/@octokit/endpoint": {
"version": "6.0.12",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/types": "^6.0.3",
"is-plain-object": "^5.0.0",
"universal-user-agent": "^6.0.0"
}
},
"node_modules/@octokit/graphql": {
"version": "7.0.2",
"resolved": "https://registry.npmjs.org/@octokit/graphql/-/graphql-7.0.2.tgz",
"integrity": "sha512-OJ2iGMtj5Tg3s6RaXH22cJcxXRi7Y3EBqbHTBRq+PQAqfaS8f/236fUrWhfSn8P4jovyzqucxme7/vWSSZBX2Q==",
"dev": true,
"dependencies": {
"@octokit/request": "^8.0.1",
"@octokit/types": "^12.0.0",
"universal-user-agent": "^6.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/graphql/node_modules/@octokit/endpoint": {
"version": "9.0.1",
"resolved": "https://registry.npmjs.org/@octokit/endpoint/-/endpoint-9.0.1.tgz",
"integrity": "sha512-hRlOKAovtINHQPYHZlfyFwaM8OyetxeoC81lAkBy34uLb8exrZB50SQdeW3EROqiY9G9yxQTpp5OHTV54QD+vA==",
"dev": true,
"dependencies": {
"@octokit/types": "^12.0.0",
"is-plain-object": "^5.0.0",
"universal-user-agent": "^6.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/graphql/node_modules/@octokit/openapi-types": {
"version": "19.0.0",
"resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-19.0.0.tgz",
"integrity": "sha512-PclQ6JGMTE9iUStpzMkwLCISFn/wDeRjkZFIKALpvJQNBGwDoYYi2fFvuHwssoQ1rXI5mfh6jgTgWuddeUzfWw==",
"dev": true
},
"node_modules/@octokit/graphql/node_modules/@octokit/request": {
"version": "8.1.2",
"resolved": "https://registry.npmjs.org/@octokit/request/-/request-8.1.2.tgz",
"integrity": "sha512-A0RJJfzjlZQwb+39eDm5UM23dkxbp28WEG4p2ueH+Q2yY4p349aRK/vcUlEuIB//ggcrHJceoYYkBP/LYCoXEg==",
"dev": true,
"dependencies": {
"@octokit/endpoint": "^9.0.0",
"@octokit/request-error": "^5.0.0",
"@octokit/types": "^12.0.0",
"is-plain-object": "^5.0.0",
"universal-user-agent": "^6.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/graphql/node_modules/@octokit/types": {
"version": "12.0.0",
"resolved": "https://registry.npmjs.org/@octokit/types/-/types-12.0.0.tgz",
"integrity": "sha512-EzD434aHTFifGudYAygnFlS1Tl6KhbTynEWELQXIbTY8Msvb5nEqTZIm7sbPEt4mQYLZwu3zPKVdeIrw0g7ovg==",
"dev": true,
"dependencies": {
"@octokit/openapi-types": "^19.0.0"
}
},
"node_modules/@octokit/openapi-types": {
"version": "11.2.0",
"dev": true,
"license": "MIT"
},
"node_modules/@octokit/plugin-paginate-rest": {
"version": "2.17.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/types": "^6.34.0"
},
"peerDependencies": {
"@octokit/core": ">=2"
}
},
"node_modules/@octokit/plugin-rest-endpoint-methods": {
"version": "5.13.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/types": "^6.34.0",
"deprecation": "^2.3.1"
},
"peerDependencies": {
"@octokit/core": ">=3"
}
},
"node_modules/@octokit/request": {
"version": "5.6.3",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/endpoint": "^6.0.1",
"@octokit/request-error": "^2.1.0",
"@octokit/types": "^6.16.1",
"is-plain-object": "^5.0.0",
"node-fetch": "^2.6.7",
"universal-user-agent": "^6.0.0"
}
},
"node_modules/@octokit/request-error": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/@octokit/request-error/-/request-error-5.0.1.tgz",
"integrity": "sha512-X7pnyTMV7MgtGmiXBwmO6M5kIPrntOXdyKZLigNfQWSEQzVxR4a4vo49vJjTWX70mPndj8KhfT4Dx+2Ng3vnBQ==",
"dev": true,
"dependencies": {
"@octokit/types": "^12.0.0",
"deprecation": "^2.0.0",
"once": "^1.4.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/request-error/node_modules/@octokit/openapi-types": {
"version": "19.0.0",
"resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-19.0.0.tgz",
"integrity": "sha512-PclQ6JGMTE9iUStpzMkwLCISFn/wDeRjkZFIKALpvJQNBGwDoYYi2fFvuHwssoQ1rXI5mfh6jgTgWuddeUzfWw==",
"dev": true
},
"node_modules/@octokit/request-error/node_modules/@octokit/types": {
"version": "12.0.0",
"resolved": "https://registry.npmjs.org/@octokit/types/-/types-12.0.0.tgz",
"integrity": "sha512-EzD434aHTFifGudYAygnFlS1Tl6KhbTynEWELQXIbTY8Msvb5nEqTZIm7sbPEt4mQYLZwu3zPKVdeIrw0g7ovg==",
"dev": true,
"dependencies": {
"@octokit/openapi-types": "^19.0.0"
}
},
"node_modules/@octokit/request/node_modules/@octokit/request-error": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/@octokit/request-error/-/request-error-2.1.0.tgz",
"integrity": "sha512-1VIvgXxs9WHSjicsRwq8PlR2LR2x6DwsJAaFgzdi0JfJoGSO8mYI/cHJQ+9FbN21aa+DrgNLnwObmyeSC8Rmpg==",
"dev": true,
"dependencies": {
"@octokit/types": "^6.0.3",
"deprecation": "^2.0.0",
"once": "^1.4.0"
}
},
"node_modules/@octokit/rest": {
"version": "20.0.2",
"resolved": "https://registry.npmjs.org/@octokit/rest/-/rest-20.0.2.tgz",
"integrity": "sha512-Ux8NDgEraQ/DMAU1PlAohyfBBXDwhnX2j33Z1nJNziqAfHi70PuxkFYIcIt8aIAxtRE7KVuKp8lSR8pA0J5iOQ==",
"dev": true,
"dependencies": {
"@octokit/core": "^5.0.0",
"@octokit/plugin-paginate-rest": "^9.0.0",
"@octokit/plugin-request-log": "^4.0.0",
"@octokit/plugin-rest-endpoint-methods": "^10.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/auth-token": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/@octokit/auth-token/-/auth-token-4.0.0.tgz",
"integrity": "sha512-tY/msAuJo6ARbK6SPIxZrPBms3xPbfwBrulZe0Wtr/DIY9lje2HeV1uoebShn6mx7SjCHif6EjMvoREj+gZ+SA==",
@@ -2156,7 +1960,7 @@
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/core": {
"node_modules/@octokit/core": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/@octokit/core/-/core-5.0.1.tgz",
"integrity": "sha512-lyeeeZyESFo+ffI801SaBKmCfsvarO+dgV8/0gD8u1d87clbEdWsP5yC+dSj3zLhb2eIf5SJrn6vDz9AheETHw==",
@@ -2174,7 +1978,7 @@
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/endpoint": {
"node_modules/@octokit/endpoint": {
"version": "9.0.1",
"resolved": "https://registry.npmjs.org/@octokit/endpoint/-/endpoint-9.0.1.tgz",
"integrity": "sha512-hRlOKAovtINHQPYHZlfyFwaM8OyetxeoC81lAkBy34uLb8exrZB50SQdeW3EROqiY9G9yxQTpp5OHTV54QD+vA==",
@@ -2188,13 +1992,27 @@
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/openapi-types": {
"node_modules/@octokit/graphql": {
"version": "7.0.2",
"resolved": "https://registry.npmjs.org/@octokit/graphql/-/graphql-7.0.2.tgz",
"integrity": "sha512-OJ2iGMtj5Tg3s6RaXH22cJcxXRi7Y3EBqbHTBRq+PQAqfaS8f/236fUrWhfSn8P4jovyzqucxme7/vWSSZBX2Q==",
"dev": true,
"dependencies": {
"@octokit/request": "^8.0.1",
"@octokit/types": "^12.0.0",
"universal-user-agent": "^6.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/openapi-types": {
"version": "19.0.0",
"resolved": "https://registry.npmjs.org/@octokit/openapi-types/-/openapi-types-19.0.0.tgz",
"integrity": "sha512-PclQ6JGMTE9iUStpzMkwLCISFn/wDeRjkZFIKALpvJQNBGwDoYYi2fFvuHwssoQ1rXI5mfh6jgTgWuddeUzfWw==",
"dev": true
},
"node_modules/@octokit/rest/node_modules/@octokit/plugin-paginate-rest": {
"node_modules/@octokit/plugin-paginate-rest": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/@octokit/plugin-paginate-rest/-/plugin-paginate-rest-9.0.0.tgz",
"integrity": "sha512-oIJzCpttmBTlEhBmRvb+b9rlnGpmFgDtZ0bB6nq39qIod6A5DP+7RkVLMOixIgRCYSHDTeayWqmiJ2SZ6xgfdw==",
@@ -2209,19 +2027,7 @@
"@octokit/core": ">=5"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/plugin-request-log": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/@octokit/plugin-request-log/-/plugin-request-log-4.0.0.tgz",
"integrity": "sha512-2uJI1COtYCq8Z4yNSnM231TgH50bRkheQ9+aH8TnZanB6QilOnx8RMD2qsnamSOXtDj0ilxvevf5fGsBhBBzKA==",
"dev": true,
"engines": {
"node": ">= 18"
},
"peerDependencies": {
"@octokit/core": ">=5"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/plugin-rest-endpoint-methods": {
"node_modules/@octokit/plugin-rest-endpoint-methods": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/@octokit/plugin-rest-endpoint-methods/-/plugin-rest-endpoint-methods-10.0.0.tgz",
"integrity": "sha512-16VkwE2v6rXU+/gBsYC62M8lKWOphY5Lg4wpjYnVE9Zbu0J6IwiT5kILoj1YOB53XLmcJR+Nqp8DmifOPY4H3g==",
@@ -2236,10 +2042,10 @@
"@octokit/core": ">=5"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/request": {
"version": "8.1.2",
"resolved": "https://registry.npmjs.org/@octokit/request/-/request-8.1.2.tgz",
"integrity": "sha512-A0RJJfzjlZQwb+39eDm5UM23dkxbp28WEG4p2ueH+Q2yY4p349aRK/vcUlEuIB//ggcrHJceoYYkBP/LYCoXEg==",
"node_modules/@octokit/request": {
"version": "8.1.4",
"resolved": "https://registry.npmjs.org/@octokit/request/-/request-8.1.4.tgz",
"integrity": "sha512-M0aaFfpGPEKrg7XoA/gwgRvc9MSXHRO2Ioki1qrPDbl1e9YhjIwVoHE7HIKmv/m3idzldj//xBujcFNqGX6ENA==",
"dev": true,
"dependencies": {
"@octokit/endpoint": "^9.0.0",
@@ -2252,7 +2058,48 @@
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/types": {
"node_modules/@octokit/request-error": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/@octokit/request-error/-/request-error-5.0.1.tgz",
"integrity": "sha512-X7pnyTMV7MgtGmiXBwmO6M5kIPrntOXdyKZLigNfQWSEQzVxR4a4vo49vJjTWX70mPndj8KhfT4Dx+2Ng3vnBQ==",
"dev": true,
"dependencies": {
"@octokit/types": "^12.0.0",
"deprecation": "^2.0.0",
"once": "^1.4.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/rest": {
"version": "20.0.2",
"resolved": "https://registry.npmjs.org/@octokit/rest/-/rest-20.0.2.tgz",
"integrity": "sha512-Ux8NDgEraQ/DMAU1PlAohyfBBXDwhnX2j33Z1nJNziqAfHi70PuxkFYIcIt8aIAxtRE7KVuKp8lSR8pA0J5iOQ==",
"dev": true,
"dependencies": {
"@octokit/core": "^5.0.0",
"@octokit/plugin-paginate-rest": "^9.0.0",
"@octokit/plugin-request-log": "^4.0.0",
"@octokit/plugin-rest-endpoint-methods": "^10.0.0"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/@octokit/rest/node_modules/@octokit/plugin-request-log": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/@octokit/plugin-request-log/-/plugin-request-log-4.0.0.tgz",
"integrity": "sha512-2uJI1COtYCq8Z4yNSnM231TgH50bRkheQ9+aH8TnZanB6QilOnx8RMD2qsnamSOXtDj0ilxvevf5fGsBhBBzKA==",
"dev": true,
"engines": {
"node": ">= 18"
},
"peerDependencies": {
"@octokit/core": ">=5"
}
},
"node_modules/@octokit/types": {
"version": "12.0.0",
"resolved": "https://registry.npmjs.org/@octokit/types/-/types-12.0.0.tgz",
"integrity": "sha512-EzD434aHTFifGudYAygnFlS1Tl6KhbTynEWELQXIbTY8Msvb5nEqTZIm7sbPEt4mQYLZwu3zPKVdeIrw0g7ovg==",
@@ -2261,14 +2108,6 @@
"@octokit/openapi-types": "^19.0.0"
}
},
"node_modules/@octokit/types": {
"version": "6.34.0",
"dev": true,
"license": "MIT",
"dependencies": {
"@octokit/openapi-types": "^11.2.0"
}
},
"node_modules/@oddbird/popover-polyfill": {
"version": "0.0.10",
"license": "BSD-3-Clause"
@@ -11103,44 +10942,6 @@
"resolved": "https://registry.npmjs.org/node-addon-api/-/node-addon-api-6.1.0.tgz",
"integrity": "sha512-+eawOlIgy680F0kBzPUNFhMZGtJ1YmqM6l4+Crf4IkImjYrO/mqPwRMh352g23uIaQKFItcQ64I7KMaJxHgAVA=="
},
"node_modules/node-fetch": {
"version": "2.6.7",
"dev": true,
"license": "MIT",
"dependencies": {
"whatwg-url": "^5.0.0"
},
"engines": {
"node": "4.x || >=6.0.0"
},
"peerDependencies": {
"encoding": "^0.1.0"
},
"peerDependenciesMeta": {
"encoding": {
"optional": true
}
}
},
"node_modules/node-fetch/node_modules/tr46": {
"version": "0.0.3",
"dev": true,
"license": "MIT"
},
"node_modules/node-fetch/node_modules/webidl-conversions": {
"version": "3.0.1",
"dev": true,
"license": "BSD-2-Clause"
},
"node_modules/node-fetch/node_modules/whatwg-url": {
"version": "5.0.0",
"dev": true,
"license": "MIT",
"dependencies": {
"tr46": "~0.0.3",
"webidl-conversions": "^3.0.0"
}
},
"node_modules/node-int64": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/node-int64/-/node-int64-0.4.0.tgz",
@@ -14835,6 +14636,18 @@
"dev": true,
"license": "MIT"
},
"node_modules/undici": {
"version": "5.25.4",
"resolved": "https://registry.npmjs.org/undici/-/undici-5.25.4.tgz",
"integrity": "sha512-450yJxT29qKMf3aoudzFpIciqpx6Pji3hEWaXqXmanbXF58LTAGCKxcJjxMXWu3iG+Mudgo3ZUfDB6YDFd/dAw==",
"dev": true,
"dependencies": {
"@fastify/busboy": "^2.0.0"
},
"engines": {
"node": ">=14.0"
}
},
"node_modules/unified": {
"version": "11.0.3",
"resolved": "https://registry.npmjs.org/unified/-/unified-11.0.3.tgz",

View File

@@ -139,7 +139,7 @@
},
"devDependencies": {
"@actions/core": "^1.10.0",
"@actions/github": "^5.0.3",
"@actions/github": "^6.0.0",
"@axe-core/playwright": "^4.7.3",
"@github/markdownlint-github": "^0.4.1",
"@graphql-inspector/core": "^5.0.0",

View File

@@ -1,671 +0,0 @@
# Scripts
## Scripts to rule them all
This directory follows the [Scripts to Rule Them All](https://githubengineering.com/scripts-to-rule-them-all/) pattern:
### [`bootstrap`](bootstrap)
Installs/updates all dependencies necessary for the docs environment. Equivalent of `npm install`.
---
### [`server`](server)
Starts the local development server. Equivalent of `npm start`.
To keep things snappy, only English and Japanese are enabled. To run the server with all languages enabled, run script/server-all-languages
---
### [`test`](test)
Runs tests. Equivalent of `npm test`.
---
## Additional scripts
### [`anonymize-branch.js`](anonymize-branch.js)
Flatten all the commits in the current branch into a single anonymized @docs-bot commit
Usage: `script/anonymize-branch.js <new-commit-message> [base-branch]`
Example: `script/anonymize-branch.js "nothing to see here"`
If the optional [base-branch] argument is omitted, it will default to `main`.
---
### [`bookmarklets/add-pr-links.js`](bookmarklets/add-pr-links.js)
---
### [`bookmarklets/open-in-vscode.js`](bookmarklets/open-in-vscode.js)
---
### [`bookmarklets/pr-link-source.js`](bookmarklets/pr-link-source.js)
---
### [`bookmarklets/view-in-development.js`](bookmarklets/view-in-development.js)
---
### [`bookmarklets/view-in-production.js`](bookmarklets/view-in-production.js)
---
### [`check-for-node`](check-for-node)
This script is run automatically when you run the server locally. It checks whether Node.js is installed.
---
### [`check-github-github-links.js`](check-github-github-links.js)
Run this script to get all broken docs.github.com links in github/github
---
### [`content-migrations/add-early-access-tocs.js`](content-migrations/add-early-access-tocs.js)
---
### [`content-migrations/add-ghec-to-schema.js`](content-migrations/add-ghec-to-schema.js)
A one-time use script to add GHEC to the REST schema on github/github.
---
### [`content-migrations/add_mini_toc_frontmatter.js`](content-migrations/add_mini_toc_frontmatter.js)
Run this one-time script to add a max mini TOC to the REST reference documentation.
---
### [`content-migrations/comment-on-open-prs.js`](content-migrations/comment-on-open-prs.js)
This script finds all open PRs from active branches that touch content files, and adds a comment with steps to run some commands. The idea is to help writers and other Hubbers update their open branches and mitigate conflicts with the main branch.
---
### [`content-migrations/convert-if-to-ifversion.js`](content-migrations/convert-if-to-ifversion.js)
Run this one-time script to convert `if <feature name>` Liquid tags to `ifversion <feature name>`.
---
### [`content-migrations/create-csv-of-short-titles.js`](content-migrations/create-csv-of-short-titles.js)
---
### [`content-migrations/move-unique-image-assets.js`](content-migrations/move-unique-image-assets.js)
---
### [`content-migrations/remove-html-comments-from-index-files.js`](content-migrations/remove-html-comments-from-index-files.js)
---
### [`content-migrations/topics-upcase.js`](content-migrations/topics-upcase.js)
---
### [`content-migrations/update-developer-site-links.js`](content-migrations/update-developer-site-links.js)
---
### [`content-migrations/update-headers.js`](content-migrations/update-headers.js)
Run this one-time script to update headers for accessibility, changing H3 to H2, H4 to H3, H5 to H4, and H6 to H5.
---
### [`content-migrations/update-versioning-in-files.js`](content-migrations/update-versioning-in-files.js)
---
### [`content-migrations/use-short-versions.js`](content-migrations/use-short-versions.js)
Run this script to convert long form Liquid conditionals (e.g., {% if currentVersion == "free-pro-team" %}) to the new custom tag (e.g., {% ifversion fpt %}) and also use the short names in versions frontmatter.
---
### [`copy-to-test-repo.sh`](copy-to-test-repo.sh)
---
### [`create-glossary-from-spreadsheet.js`](create-glossary-from-spreadsheet.js)
This script turns a Google Sheets CSV spreadsheet into a YAML file.
---
### [`deployment/purge-edge-cache.js`](deployment/purge-edge-cache.js)
---
### [`dev-toc/generate.js`](dev-toc/generate.js)
---
### [`dev-toc/index.js`](dev-toc/index.js)
---
### [`dev-toc/layout.html`](dev-toc/layout.html)
---
### [`domwaiter.js`](domwaiter.js)
---
### [`early-access/clone-locally`](early-access/clone-locally)
This script is run on a writer's machine to begin developing Early Access content locally.
---
### [`early-access/create-branch`](early-access/create-branch)
This script is run on a writer's machine to create an Early Access branch that matches the current docs-internal branch.
---
### [`early-access/symlink-from-local-repo.js`](early-access/symlink-from-local-repo.js)
This script is run on a writer's machine while developing Early Access content locally. You must pass the script the location of your local copy of the `github/docs-early-access` git repo as the first argument.
---
### [`early-access/update-data-and-image-paths.js`](early-access/update-data-and-image-paths.js)
This script is run on a writer's machine while developing Early Access content locally. It updates the data and image paths to either include `early-access` or remove it.
---
### [`find-orphaned-assets.js`](find-orphaned-assets.js)
Print a list of all the asset files that can't be found mentioned in any of the source files (content & code).
---
### [`get-new-dotcom-path.js`](get-new-dotcom-path.js)
Pass this script any old dotcom path (e.g., `articles/foo` or `foo.md`) and it will output the new path in the content/github directory.
---
### [`graphql/build-changelog.js`](graphql/build-changelog.js)
---
### [`graphql/update-files.js`](graphql/update-files.js)
---
### [`graphql/utils/data-filenames.json`](graphql/utils/data-filenames.json)
---
### [`graphql/utils/process-previews.js`](graphql/utils/process-previews.js)
---
### [`graphql/utils/process-schemas.js`](graphql/utils/process-schemas.js)
---
### [`graphql/utils/process-upcoming-changes.js`](graphql/utils/process-upcoming-changes.js)
---
### [`graphql/utils/schema-helpers.js`](graphql/utils/schema-helpers.js)
---
### [`helpers/action-injections.js`](helpers/action-injections.js)
---
### [`helpers/add-redirect-to-frontmatter.js`](helpers/add-redirect-to-frontmatter.js)
---
### [`helpers/get-liquid-conditionals.js`](helpers/get-liquid-conditionals.js)
---
### [`helpers/git-utils.js`](helpers/git-utils.js)
---
### [`helpers/github.js`](helpers/github.js)
---
### [`helpers/retry-on-error-test.js`](helpers/retry-on-error-test.js)
Return a function that you can use to run any code within, and if it throws you get a chance to say whether to sleep + retry. Example:

    async function mainFunction() {
      if (Math.random() > 0.9) throw new Error('too large')
      return 'OK'
    }

    const errorTest = (err) => err instanceof Error && err.message.includes('too large')
    const config = {
      // all optional
      attempts: 3,
      sleepTime: 800,
      onError: (err, attempts) => console.warn(`Failed ${attempts} attempts`),
    }
    const ok = await retry(errorTest, mainFunction, config)
---
### [`helpers/walk-files.js`](helpers/walk-files.js)
A helper that returns an array of files for a given path and file extension.
---
### [`i18n/test-html-pages.js`](i18n/test-html-pages.js)
---
### [`kill-server-for-jest.js`](kill-server-for-jest.js)
---
### [`list-image-sizes.js`](list-image-sizes.js)
This script lists all local image files, sorted by their dimensions.
---
### [`move-category-to-product.js`](move-category-to-product.js)
Move the files from a category directory to a top-level product and add redirects.
---
### [`move-content.js`](move-content.js)
Helps you move (a.k.a. rename) a file or a folder and does what's needed with frontmatter redirect_from.
---
### [`pages-with-liquid-titles.js`](pages-with-liquid-titles.js)
This is a temporary script to visualize which pages have liquid (and conditionals) in their `title` frontmatter
---
### [`prevent-pushes-to-main.js`](prevent-pushes-to-main.js)
This script is intended to be used as a git "prepush" hook. If the current branch is main, it will exit unsuccessfully and prevent the push.
---
### [`purge-fastly`](purge-fastly)
Run this script to manually purge the Fastly cache. Note this script requires a `FASTLY_SERVICE_ID` and `FASTLY_TOKEN` in your `.env` file.
---
### [`purge-fastly-by-url.js`](purge-fastly-by-url.js)
Run this script to manually purge the Fastly cache for all language variants of a single URL or for a batch of URLs in a file. This script does not require authentication.
---
### [`reconcile-category-dirs-with-ids.js`](reconcile-category-dirs-with-ids.js)
An automated test checks for discrepancies between category directory names and slugified category titles as IDs.
If the test fails, a human needs to run this script to update the directory names and add appropriate redirects.
**This script is not currently supported on Windows.**
---
### [`reconcile-filenames-with-ids.js`](reconcile-filenames-with-ids.js)
An automated test checks for discrepancies between filenames and [autogenerated heading IDs](https://www.npmjs.com/package/remark-autolink-headings). If the test fails, a human needs to run this script to update the filenames.
**This script is not currently supported on Windows.**
---
### [`rendered-content-link-checker.js`](rendered-content-link-checker.js)
This script goes through all content and renders their HTML and from there can analyze for various flaws (e.g. broken links)
---
### [`rest/openapi-check.js`](rest/openapi-check.js)
Run this script to check if OpenAPI files can be decorated successfully.
---
### [`rest/test-open-api-schema.js`](rest/test-open-api-schema.js)
Run this script to check if OpenAPI operations match versions in content/rest operations
---
### [`rest/update-files.js`](rest/update-files.js)
Run this script to pull openAPI files from github/github, dereference them, and decorate them.
---
### [`rest/utils/create-rest-examples.js`](rest/utils/create-rest-examples.js)
---
### [`rest/utils/decorator.js`](rest/utils/decorator.js)
---
### [`rest/utils/get-body-params.js`](rest/utils/get-body-params.js)
---
### [`rest/utils/get-operations.js`](rest/utils/get-operations.js)
---
### [`rest/utils/operation-schema.js`](rest/utils/operation-schema.js)
---
### [`rest/utils/operation.js`](rest/utils/operation.js)
---
### [`rest/utils/webhook-schema.js`](rest/utils/webhook-schema.js)
---
### [`rest/utils/webhook.js`](rest/utils/webhook.js)
---
### [`search/analyze-text.js`](search/analyze-text.js)
See how a piece of text gets turned into tokens by the different analyzers. Requires that the index exists in Elasticsearch.
Example:
./src/scripts/search/analyze-text.js my words to tokenize
---
### [`search/build-records.js`](search/build-records.js)
---
### [`search/find-indexable-pages.js`](search/find-indexable-pages.js)
---
### [`search/index-elasticsearch.js`](search/index-elasticsearch.js)
Creates Elasticsearch index, populates from records, moves the index alias, deletes old indexes.
---
### [`search/parse-page-sections-into-records.js`](search/parse-page-sections-into-records.js)
---
### [`search/popular-pages.js`](search/popular-pages.js)
---
### [`search/search-index-records.js`](search/search-index-records.js)
---
### [`search/sync-search-indices.js`](search/sync-search-indices.js)
This script is run automatically via GitHub Actions on every push to `main` to generate searchable data. It can also be run manually. For more info see [contributing/search.md](contributing/search.md)
---
### [`search/sync.js`](search/sync.js)
---
### [`search/validate-records.js`](search/validate-records.js)
---
### [`server-all-languages`](server-all-languages)
Starts the local development server with all of the available languages enabled.
---
### [`server-for-jest.js`](server-for-jest.js)
---
### [`standardize-frontmatter-order.js`](standardize-frontmatter-order.js)
Run this script to standardize frontmatter fields in all content files, per the order:
- title
- intro
- product callout
- productVersion
- map topic status
- hidden status
- layout
- redirect
---
### [`start-server-for-jest.js`](start-server-for-jest.js)
---
### [`todo`](todo)
List all the TODOs in our JavaScript files and stylesheets.
---
### [`toggle-ghae-feature-flags.js`](toggle-ghae-feature-flags.js)
Find and replace lightweight feature flags for GitHub AE content.
---
### [`update-internal-links.js`](update-internal-links.js)
Run this script to find internal links in all content and data Markdown files, check if either the title or link (or both) are outdated, and automatically update them if so.
Exceptions:
* Links with fragments (e.g., [Bar](/foo#bar)) will get their root links updated if necessary, but the fragment and title will be unchanged (e.g., [Bar](/noo#bar)).
* Links with hardcoded versions (e.g., [Foo](/enterprise-server/baz)) will get their root links updated if necessary, but the hardcoded versions will be preserved (e.g., [Foo](/enterprise-server/qux)).
* Links with Liquid in the titles will have their root links updated if necessary, but the titles will be preserved.
---
### [`update-readme.js`](update-readme.js)
This script crawls the script directory, hooks on special comment markers in each script, and adds the comment to `script/README.md`.
---

View File

@@ -1,348 +0,0 @@
#!/usr/bin/env node
import { program } from 'commander'
import fs from 'fs/promises'
import { flatten } from 'flat'
import { visit } from 'unist-util-visit'
import { fromMarkdown } from 'mdast-util-from-markdown'
import { gfmTable } from 'micromark-extension-gfm-table'
import { gfmTableFromMarkdown } from 'mdast-util-gfm-table'
import { getDataByLanguage } from '../lib/get-data.js'
import walkFiles from './helpers/walk-files.js'
import readFm from '../lib/read-frontmatter.js'
const reusablesRegex = /{% data (reusables.+?) %}/g
const justReusablesRegex = new RegExp(reusablesRegex.source)
program
.description('Run accessibility checks.')
.option('-a, --all', 'Run all heading checks.')
.option('-d, --duplicates', 'List any duplicate headings per file.')
.option('-f, --firsts', 'List any first headings in an article that are not an H2.')
.option('-l, --levels', 'List any headings that increment by more than one level.')
.option('-c, --content', 'List any headings that lack content between them.')
.option('-t, --tables', 'List every table in the content.')
.option('-p, --paths [paths...]', 'Specify filepaths to include.')
.parse(process.argv)
const opts = Object.assign({}, program.opts())
const headingChecks = ['duplicates', 'firsts', 'levels', 'content']
const tableChecks = ['tables']
const allChecks = headingChecks.concat(tableChecks)
// Some options handling.
if (opts.all) headingChecks.forEach((headingCheck) => (opts[headingCheck] = true))
const requestedChecks = Object.keys(opts).filter((key) => allChecks.includes(key))
if (!requestedChecks.length) program.help()
const requestedheadingChecks = requestedChecks.filter((requestedCheck) =>
headingChecks.includes(requestedCheck),
)
if (!opts.all && requestedheadingChecks.length) opts.all = true
console.log(`\nNotes:
* This script does not check specially rendered pages: REST, GraphQL, webhook payloads, release notes.
* The reported results may have their source in data/reusables files.`)
const checkEmoji = '🔎'
const errorEmoji = '👉'
const cleanEmoji = '✅'
const checkMsgs = {
firsts: {
checkingMsg: 'Checking for non-H2 first headings',
resultsMsg: 'pages with a first heading that is not an h2!',
},
duplicates: {
checkingMsg: 'Checking for duplicate headings',
resultsMsg: 'pages with duplicate headings!',
},
levels: {
checkingMsg: 'Checking for headings that skip levels',
resultsMsg: 'pages with headings that increment by more than one level!',
},
content: {
checkingMsg: 'Checking for headings without content in between',
resultsMsg: 'pages with headings that do not have any content in between!',
},
tables: {
checkingMsg: 'Finding all tables',
resultsMsg: 'pages with tables!',
},
}
console.log('')
// Log which checks will be running.
requestedChecks.forEach((requestedCheck) => {
console.log(`${checkEmoji} ${checkMsgs[requestedCheck].checkingMsg}...`)
})
console.log('')
const errors = []
const tables = []
// Run the checks!
await checkMarkdownPages()
// Format the results
if (opts.tables) {
const allTables = []
formatTableResults(tables, allTables)
const total = allTables.length
console.log(`Total tables found in Markdown files: ${total}`)
}
if (opts.all) {
formatHeadingErrors(errors)
}
async function checkMarkdownPages() {
const mdFiles = filterFiles(getAllFiles())
await Promise.all(
mdFiles.map(async (file) => {
const rawContents = await fs.readFile(file, 'utf8')
const { content: body } = readFm(rawContents)
const withReusables = await getReusableText(body)
const ast = getAst(withReusables)
const shortPath = file.replace(`${process.cwd()}/`, '')
if (opts.tables) {
const tableObj = createTableObj(shortPath)
getTablesFromMdast(ast, tableObj)
}
if (opts.all) {
const errorObj = createErrorObj(shortPath)
const headingNodes = getElementFromMdast('heading', ast)
const headingObjs = getheadingObjs(headingNodes)
runheadingChecks(headingObjs, ast, errorObj)
}
}),
)
}
/* HEADING CHECKS */
function runheadingChecks(headingObjs, parsed, errorObj) {
if (!headingObjs.length) return
if (opts.firsts) {
checkFirsts(headingObjs, errorObj)
}
if (opts.levels) {
checkLevels(headingObjs, errorObj)
}
if (opts.duplicates) {
checkDuplicates(headingObjs, errorObj)
}
if (opts.content) {
typeof parsed === 'function'
? checkContentInHtml(parsed, errorObj)
: checkContentInMdast(parsed, errorObj)
}
errors.push(errorObj)
}
/* VALIDATION FUNCTIONS */
function checkFirsts(headingObjs, errorObj) {
const firstHeading = headingObjs[0]
if (firstHeading.level !== 2) {
errorObj.firsts.add(cleanHeading(firstHeading))
}
}
function checkLevels(headingObjs, errorObj) {
headingObjs.forEach((headingObj, ix) => {
if (ix === 0) return
const previousIndex = ix - 1
const previousObj = headingObjs[previousIndex]
const isInvalid = headingObj.level - previousObj.level > 1
if (!isInvalid) return
errorObj.levels.add(`${cleanHeading(previousObj)}\n${cleanHeading(headingObj)}`)
})
}
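// Flag headings whose text matches another heading on the same page, case-insensitively.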
function checkDuplicates(headingObjs, errorObj) {
const duplicates = headingObjs.filter((headingObj, index) => {
return headingObjs.filter(
(hObj, ix) => headingObj.text.toLowerCase() === hObj.text.toLowerCase() && index !== ix,
).length
})
if (!duplicates.length) return
const dupesString = duplicates.map((hObj) => cleanHeading(hObj)).join('\n')
errorObj.duplicates.add(dupesString)
}
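// Two consecutive heading nodes in the mdast tree mean there is no content between them.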
function checkContentInMdast(ast, errorObj) {
const results = []
ast.children.forEach((childNode, index) => {
if (index === 0) return
if (childNode.type === 'heading') {
const previousNodeIndex = index - 1
const previousNode = ast.children[previousNodeIndex]
if (previousNode.type === 'heading') {
results.push({
previous: getHeadingObjs([previousNode]),
current: getHeadingObjs([childNode]),
})
}
}
})
if (!results.length) return
results.forEach((resultObj) => {
errorObj.content.add(
`${cleanHeading(resultObj.previous[0])}\n${cleanHeading(resultObj.current[0])}`,
)
})
}
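// HTML variant of the check above: `parsed` is a selector function (e.g. a cheerio instance).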
function checkContentInHtml(parsed, errorObj) {
const results = []
parsed('*').map((index, currentNode) => {
if (index === 0) return false
if (/h[2-6]/.test(currentNode.name)) {
const previousNodeIndex = index - 1
const previousNode = parsed('*')[previousNodeIndex]
if (/h[2-6]/.test(previousNode.name)) {
results.push({
previous: {
level: previousNode.name.replace('h', ''),
text: parsed(previousNode).text(),
},
current: {
level: currentNode.name.replace('h', ''),
text: parsed(currentNode).text(),
},
})
}
}
})
if (!results.length) return
results.forEach((resultObj) => {
errorObj.content.add(`${cleanHeading(resultObj.previous)}\n${cleanHeading(resultObj.current)}`)
})
}
/* MARKDOWN FUNCTIONS */
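// Parse a Markdown string into an mdast tree, with GFM table support enabled.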
function getAst(doc) {
return fromMarkdown(doc, {
extensions: [gfmTable],
mdastExtensions: [gfmTableFromMarkdown],
})
}
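// Inline the English source text for any reusable tags found in the body so that content is checked too.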
async function getReusableText(body) {
const reusables = body.match(reusablesRegex) || []
if (!reusables.length) return body
let newBody = body
for (const reusable of reusables) {
const justReusable = reusable.match(justReusablesRegex)[1].trim()
const text = getDataByLanguage(justReusable, 'en')
// Replace in the accumulated string; replacing in the original body would keep only the last substitution.
newBody = newBody.replace(reusable, text)
}
return newBody
}
function getElementFromMdast(element, ast) {
const elements = []
visit(ast, (node) => {
if (node.type === element) {
elements.push(node)
}
})
return elements
}
function getTablesFromMdast(ast, tableObj) {
const tableNodes = getElementFromMdast('table', ast)
if (!tableNodes.length) return
const firstRows = tableNodes.map((table) => {
const firstRow = table.children[0]
return Object.entries(flatten(firstRow))
.filter(([key, _val]) => key.endsWith('value'))
.map(([_key, val]) => val)
.join(', ')
})
tableObj.tables.push(...firstRows)
tables.push(tableObj)
}
/* SHARED UTILITIES */
function getAllFiles() {
return walkFiles('content', '.md')
}
function filterFiles(files) {
if (!opts.paths) return files
const filtered = files.filter((file) => opts.paths.some((path) => file.includes(path)))
if (!filtered.length) {
console.error(`Error! Did not find any files. Check provided paths.`)
process.exit(1)
}
return filtered
}
function getHeadingObjs(headingNodes) {
return headingNodes.map((n) => {
const flatNodes = flatten(n)
const text = Object.entries(flatNodes)
.filter(([key, _val]) => key.endsWith('value'))
.map(([_key, val]) => val)
.join('')
return {
level: n.depth,
text,
}
})
}
function cleanHeading({ level, text }) {
return `${'#'.repeat(level)} ${text}`
}
/* REPORTING FUNCTIONS */
function createErrorObj(shortPath) {
return {
file: shortPath,
firsts: new Set(),
duplicates: new Set(),
levels: new Set(),
content: new Set(),
}
}
function createTableObj(shortPath) {
return {
file: shortPath,
tables: [],
}
}
function formatHeadingErrors(errors) {
requestedHeadingChecks.forEach((requestedCheck) => {
const errorsPerCheck = errors.filter((errorObj) => errorObj[requestedCheck].size)
const emoji = errorsPerCheck.length ? `\n${errorEmoji}` : cleanEmoji
const msg = `${emoji} Found ${errorsPerCheck.length} ${checkMsgs[requestedCheck].resultsMsg}`
console.log(msg)
if (!errorsPerCheck.length) return
errorsPerCheck.forEach((errorObj) => {
errorObj[requestedCheck].forEach((error) => {
console.log('')
console.log(errorObj.file)
console.log(error)
console.log('')
})
})
})
}
function formatTableResults(tables, allTables) {
const pagesWithTables = tables.filter((tableObj) => tableObj.tables.length)
if (!pagesWithTables.length) return
console.log(`${errorEmoji} Found ${pagesWithTables.length} ${checkMsgs.tables.resultsMsg}`)
pagesWithTables.forEach((tableObj) => {
console.log('')
console.log(tableObj.file)
// Spread the page's tables so allTables counts individual tables, not pages.
allTables.push(...tableObj.tables)
console.log(`Found ${tableObj.tables.length} tables`)
tableObj.tables.forEach((table) => console.log(`First row includes: ${table}`))
console.log('')
})
}

View File

@@ -1,35 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// Flatten all the commits in the current branch into a single anonymized @docs-bot commit
//
// Usage: script/anonymize-branch.js <new-commit-message> [base-branch]
// Example: script/anonymize-branch.js "nothing to see here"
// If the optional [base-branch] argument is omitted, it will default to `main`
//
// [end-readme]
import { execSync as exec } from 'child_process'
import path from 'path'
import { fileURLToPath } from 'url'
process.env.GIT_AUTHOR_NAME = process.env.GIT_COMMITTER_NAME = 'Docs Bot'
process.env.GIT_AUTHOR_EMAIL = process.env.GIT_COMMITTER_EMAIL =
'63058869+docs-bot@users.noreply.github.com'
const args = process.argv.slice(2)
const message = args[0]
const base = args[1] || 'main'
if (!message || !message.length) {
console.error(
`Specify a new commit message in quotes. Example:\n\nscript/${path.basename(
fileURLToPath(import.meta.url),
)} "new commit"`,
)
process.exit(1)
}
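// Reset to the merge base with the base branch, then stage and commit everything as one new commit.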
exec(`git reset $(git merge-base ${base} HEAD)`)
exec('git add -A')
exec(`git commit -m "${message}"`)

View File

@@ -1,11 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# Installs/updates all dependencies necessary for the docs environment. Equivalent of `npm install`.
#
# [end-readme]
source script/check-for-node
npm install

View File

@@ -1,15 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# This script is run automatically when you run the server locally. It checks whether Node.js is installed.
#
# [end-readme]
if which node 2>/dev/null; then
echo "✔ Node.js is installed."
else
echo "✘ Node.js is not installed"
echo "Visit nodejs.org to download the latest LTS installer"
exit 1
fi

View File

@@ -1,5 +0,0 @@
# Content migration scripts
This directory stores scripts that modify content and/or data files. Because
writers are updating content all the time, scripts in here require more
cross-team coordination and planning before they are run. Make sure to consider whether the changes can wait to come in through our translation automation.

View File

@@ -1,56 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// This script finds all open PRs from active branches that touch content files, and adds a comment
// with steps to run some commands. The idea is to help writers and other Hubbers update their
// open branches and mitigate conflicts with the main branch.
//
// [end-readme]
import { listPulls, createIssueComment } from '../helpers/git-utils.js'
// check for required PAT
if (!process.env.GITHUB_TOKEN) {
console.error('Error! You must have a GITHUB_TOKEN set in a .env file to run this script.')
process.exit(1)
}
const options = {
owner: 'github',
repo: 'docs-internal',
}
const comment = `
👋 Hello! The docs-engineering team has just published an update that touches all content files in the repo. To reduce conflicts with \`main\`, we are sending out this message to help folks update their branches.
You'll need to do the following steps in Terminal. If you're not into that, ask in #docs-engineering and we'll help out!
1. Check out the branch associated with this PR. Don't update from \`main\` yet.
2. Run: \`script/content-migrations/remove-map-topics.js && script/content-migrations/update-tocs.js\`
3. Commit: \`git add . && git commit -m 'ran content migration scripts'\`
4. Update: \`git pull origin main\`
You may still have some conflicts to resolve. Feel free to ask us if you have questions or need help!
For a 5min demo of what the scripts do and why they're needed, check out [this screencast](https://www.loom.com/share/fa6501580b2a44d7a8a4357ee51e0c99).
`
main()
async function main() {
const allPulls = await listPulls(options.owner, options.repo)
// get the number of open PRs only
const openPullNumbers = allPulls
.filter((pull) => pull.state === 'open')
.map((pull) => pull.number)
// for every open PR, create an issue comment
await Promise.all(
openPullNumbers.map(async (pullNumber) => {
await createIssueComment(options.owner, options.repo, pullNumber, comment)
console.log(`Added a comment to PR #${pullNumber}`)
}),
)
}

View File

@@ -1,25 +0,0 @@
#!/usr/bin/env node
import fs from 'fs'
import path from 'path'
import walk from 'walk-sync'
import readFrontmatter from '../../lib/read-frontmatter.js'
const csvFile = path.join(process.cwd(), 'shortTitles.csv')
fs.writeFileSync(csvFile, 'Product,Article Title,Short title,Relative path\n')
const files = walk(path.join(process.cwd(), 'content'), {
includeBasePath: true,
directories: false,
})
files.forEach((file) => {
const relativeFilePath = file.replace(process.cwd(), '')
const productName = relativeFilePath.split('/')[2]
const fileContent = fs.readFileSync(file, 'utf8')
const { data } = readFrontmatter(fileContent)
const { title, shortTitle } = data
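// Flag long titles (over 25 characters) that do not already have a shortTitle.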
if (title && !shortTitle && title.length > 25) {
fs.appendFileSync(csvFile, `"${productName}","${title}",,${relativeFilePath}\n`)
}
})

View File

@@ -1,44 +0,0 @@
#!/usr/bin/env node
import fs from 'fs'
import path from 'path'
import walk from 'walk-sync'
// Iterate through enterprise image directories from most recent to oldest
// and, for each asset under /assets/enterprise whose equivalent path does
// not already exist, move it into the /assets/images directory. The
// existing Markdown references will then just work.
async function main() {
const directories = [
path.join('assets/enterprise/3.0'),
path.join('assets/enterprise/github-ae'),
path.join('assets/enterprise/2.22'),
path.join('assets/enterprise/2.21'),
path.join('assets/enterprise/2.20'),
]
for (const directory of directories) {
const files = walk(path.join(process.cwd(), directory), {
includeBasePath: true,
directories: false,
})
for (const file of files) {
// get the /assets/images path from the enterprise asset path
const enterpriseRegex = /\/assets\/enterprise\/(2\.20|2\.21|2\.22|3\.0|github-ae)/
const existingFileToCompare = file.replace(enterpriseRegex, '')
if (!fs.existsSync(existingFileToCompare)) {
const newDirectoryName = path.dirname(existingFileToCompare)
if (!fs.existsSync(newDirectoryName)) {
fs.mkdirSync(newDirectoryName, { recursive: true })
}
fs.renameSync(file, existingFileToCompare)
}
}
}
}
main()
.catch(console.error)
.finally(() => console.log('Done!'))

View File

@@ -1,14 +0,0 @@
#!/usr/bin/env node
import fs from 'fs'
import path from 'path'
import walk from 'walk-sync'
const contentDir = path.join(process.cwd(), 'content')
// remove legacy commented out conditionals in index.md files
walk(contentDir, { includeBasePath: true, directories: false })
.filter((file) => file.endsWith('index.md'))
.forEach((file) => {
const newContents = fs.readFileSync(file, 'utf8').replace(/\n<!-- (if|endif) .*?-->/g, '')
fs.writeFileSync(file, newContents)
})

View File

@@ -1,36 +0,0 @@
#!/usr/bin/env node
import fs from 'fs'
import path from 'path'
import walk from 'walk-sync'
import readFrontmatter from '../../lib/read-frontmatter.js'
import allowTopics from '../../data/allowed-topics.js'
// key is the downcased value for comparison
// value is the display value with correct casing
const topicLookupObject = {}
allowTopics.forEach((topic) => {
const lowerCaseTopic = topic.toLowerCase()
topicLookupObject[lowerCaseTopic] = topic
})
const files = walk(path.join(process.cwd(), 'content'), {
includeBasePath: true,
directories: false,
})
files.forEach((file) => {
const fileContent = fs.readFileSync(file, 'utf8')
const { content, data } = readFrontmatter(fileContent)
if (data.topics === undefined) return
const topics = data.topics.map((elem) => elem.toLowerCase())
const newTopics = []
topics.forEach((topic) => {
// for each topic in the markdown file, lookup the display value
// and add it to a new array
newTopics.push(topicLookupObject[topic])
})
data.topics = newTopics
const newContents = readFrontmatter.stringify(content, data, { lineWidth: 10000 })
fs.writeFileSync(file, newContents)
})

View File

@@ -1,94 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// Run this one-time script to update headers for accessibility,
// changing H3 to H2, H4 to H3, H5 to H4, and H6 to H5.
//
// [end-readme]
import { fileURLToPath } from 'url'
import path from 'path'
import fs from 'fs'
import walk from 'walk-sync'
const __dirname = path.dirname(fileURLToPath(import.meta.url))
const re = /^#.*\n/gm
async function updateMdHeaders(dir) {
walk(dir, { includeBasePath: true, directories: false })
.filter((file) => !file.endsWith('README.md') && !file.includes('content/rest/reference'))
.forEach((file) => {
fs.readFile(file, 'utf8', (err, data) => {
if (err) return console.error(err)
const matchHeader = data.match(re)
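// Log files where a later heading has a higher level than the first heading found; these likely need manual review.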
let firstHeader = matchHeader ? matchHeader[0].split(' ')[0] : null
if (firstHeader) {
for (let index = 1; index < matchHeader.length; index++) {
const nextHeader = matchHeader[index].split(' ')[0]
if (nextHeader.length < firstHeader.length && nextHeader.length >= 3) {
console.log(file)
break
}
}
}
if (file.includes('data/reusables/')) {
if (
!file.endsWith('data/reusables/actions/actions-group-concurrency.md') &&
!file.endsWith('data/reusables/github-actions/actions-on-examples.md')
) {
firstHeader = 'reusable-' + firstHeader
}
}
let result
switch (firstHeader) {
case '#':
return
case '##':
return
case '###':
result = data
.replace(/^### /gm, '## ')
.replace(/^#### /gm, '### ')
.replace(/^##### /gm, '#### ')
.replace(/^###### /gm, '##### ')
break
case '####':
result = data
.replace(/^#### /gm, '## ')
.replace(/^##### /gm, '### ')
.replace(/^###### /gm, '#### ')
break
case 'reusable-####':
result = data.replace(/^#### /gm, '### ').replace(/^##### /gm, '#### ')
break
case 'reusable-#####':
result = data.replace(/^##### /gm, '#### ')
break
case '#####':
result = data.replace(/^##### /gm, '### ').replace(/^###### /gm, '#### ')
break
default:
return
}
fs.writeFile(file, result, 'utf8', function (err) {
if (err) return console.log(err)
})
})
})
}
async function main() {
const mdDirPaths = [
path.join(__dirname, '../../content'),
path.join(__dirname, '../../data/reusables'),
]
for (const dir of mdDirPaths) {
await updateMdHeaders(dir)
}
}
main()
.catch(console.error)
.finally(() => console.log('Done'))

View File

@@ -1,34 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// This script splits data/glossary.yml into internal and external glossary YAML files.
//
// [end-readme]
import { fileURLToPath } from 'url'
import path from 'path'
import fs from 'fs/promises'
import yaml from 'js-yaml'
const __dirname = path.dirname(fileURLToPath(import.meta.url))
const inputFile = path.join(__dirname, '../data/glossary.yml')
const glossary = yaml.load(await fs.readFile(inputFile, 'utf8'))
console.log(glossary)
const external = []
const internal = []
glossary.forEach((term) => {
if (term.internal) {
delete term.internal
internal.push(term)
} else {
external.push(term)
}
})
await fs.writeFile(path.join(__dirname, '../data/glossaries/internal.yml'), yaml.dump(internal))
await fs.writeFile(path.join(__dirname, '../data/glossaries/external.yml'), yaml.dump(external))

View File

@@ -1,82 +0,0 @@
#!/usr/bin/env node
/* Create a new subject folder
Output looks like:

src/
  xsubject/
    README.md
    docs/
      gitkeep
    lib/
    middleware/
    pages/
    components/
    stylesheets/
    scripts/
    tests/
*/
import fs from 'fs/promises'
import { program } from 'commander'
program
.description('Scaffold a new subject folder under the src/ directory.')
.option('-n, --name <string>', 'Name of subject.')
.parse(process.argv)
const name = program.opts().name
if (!name) {
throw new Error('No subject name provided.')
}
const src = 'src/'
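// Every subject gets these subfolders, each seeded with a gitkeep placeholder file.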
const subfolders = [
'docs',
'lib',
'middleware',
'pages',
'components',
'stylesheets',
'scripts',
'tests',
]
const files = [
[
'README.md',
`# ${name.toUpperCase()}
TBD what is ${name.toUpperCase()}
## What ${name.toUpperCase()} does
TBD why is ${name.toUpperCase()} on the docs
## How ${name.toUpperCase()} works
TBD step-by-step instructions to work on ${name.toUpperCase()}
## How to work on ${name.toUpperCase()}
TBD step-by-step instructions on how to work on ${name.toUpperCase()}
## How to get help for ${name.toUpperCase()}
TBD reference material
`,
],
]
const path = `${src}${name.toLowerCase()}/`
await fs.mkdir(path)
for (const subfolder of subfolders) {
await fs.mkdir(`${path}${subfolder}/`)
await fs.writeFile(`${path}${subfolder}/gitkeep`, '')
}
for (const [file, content] of files) {
await fs.writeFile(`${path}${file}`, content)
}

View File

@@ -1,30 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// This is a temporary script to visualize which pages have liquid
// (and conditionals) in their `title` frontmatter
//
// [end-readme]
import { loadPages } from '../lib/page-data.js'
import patterns from '../lib/patterns.js'
async function main() {
const pages = await loadPages()
const liquidPages = pages
.filter((page) => page.title && patterns.hasLiquid.test(page.title))
.map(({ relativePath, title }) => {
return { relativePath, title }
})
console.log(`\n\n${liquidPages.length} pages with liquid titles`)
console.log(JSON.stringify(liquidPages, null, 2))
const conditionalPages = liquidPages.filter((page) => page.title.includes('{% if'))
console.log(`\n\n\n\n${conditionalPages.length} pages with conditionals in their titles`)
console.log(JSON.stringify(conditionalPages, null, 2))
}
main()

View File

@@ -1,34 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# Run this script to manually purge the Fastly cache.
# Note this script requires a `FASTLY_SERVICE_ID` and `FASTLY_TOKEN` in your `.env` file.
#
# [end-readme]
usage()
{
echo "Error! Unable to purge the Fastly cache"
echo ""
echo "Add FASTLY_SERVICE_ID and FASTLY_TOKEN to the environment or create a .env file in the project root and set these values:"
echo ""
echo "FASTLY_SERVICE_ID=<value-goes-here>"
echo "FASTLY_TOKEN=<value-goes-here>"
exit
}
# attempt to load from .env if Fastly config is not already in ENV
if [ -z "$FASTLY_SERVICE_ID" ] || [ -z "$FASTLY_TOKEN" ]; then
# abort if .env file doesn't exist
[ -f .env ] || usage
# load config from .env
export $(cat .env | xargs)
fi
if [ -z "$FASTLY_SERVICE_ID" ] || [ -z "$FASTLY_TOKEN" ]; then
usage
else
curl -H "fastly-key: $FASTLY_TOKEN" -H "accept: application/json" -H "fastly-soft-purge: 1" -X POST "https://api.fastly.com/service/$FASTLY_SERVICE_ID/purge/all-the-things"
fi

View File

@@ -1,110 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// Run this script to manually purge the Fastly cache
// for all language variants of a single URL or for a batch of URLs in a file. This script does
// not require authentication.
//
// [end-readme]
import fs from 'fs/promises'
import path from 'path'
import { program } from 'commander'
import { execSync } from 'child_process'
import libLanguages from '#src/languages/lib/languages.js'
import { getPathWithoutLanguage } from '../lib/path-utils.js'
const languageCodes = Object.keys(libLanguages)
const requiredUrlPrefix = 'https://docs.github.com'
const purgeCommand = 'curl -s -X PURGE -H "Fastly-Soft-Purge:1"'
program
.description(
'Purge the Fastly cache for a single URL or a batch of URLs in a file, plus all language variants of the given URL(s).',
)
.option('-s, --single <URL>', `provide a single ${requiredUrlPrefix} URL`)
.option(
'-b, --batch <FILE>',
`provide a path to a file containing a list of ${requiredUrlPrefix} URLs`,
)
.option('-d, --dry-run', 'print URLs to be purged without actually purging')
.parse(process.argv)
const singleUrl = program.opts().single
const batchFile = program.opts().batch
const dryRun = program.opts().dryRun
// verify CLI options
if (!singleUrl && !batchFile) {
console.error('error: you must specify --single <URL> or --batch <FILE>.\n')
process.exit(1)
}
if (singleUrl && !singleUrl.startsWith(requiredUrlPrefix)) {
console.error(
`error: cannot purge ${singleUrl} because URLs must start with ${requiredUrlPrefix}.\n`,
)
process.exit(1)
}
if (batchFile) {
try {
await fs.readFile(batchFile)
} catch (e) {
console.error('error: cannot find batch file.\n')
process.exit(1)
}
}
// do the purge
if (singleUrl) {
purge(singleUrl)
}
if (batchFile) {
;(await fs.readFile(batchFile, 'utf8'))
.split('\n')
.filter((line) => line !== '')
.forEach((url) => {
if (!url.startsWith(requiredUrlPrefix)) {
console.error(
`error: cannot purge ${url} because URLs must start with ${requiredUrlPrefix}.\n`,
)
process.exit(1)
}
purge(url)
})
}
function purge(url) {
getLanguageVariants(url).forEach((localizedUrl) => {
if (dryRun) {
console.log(`This is a dry run! Will purge cache for ${localizedUrl}`)
return
}
console.log(`Purging cache for ${localizedUrl}`)
const result = execSync(`${purgeCommand} ${localizedUrl}`).toString()
logStatus(result)
// purge twice to ensure referenced content on the page is updated too
const secondResult = execSync(`${purgeCommand} ${localizedUrl}`).toString()
logStatus(secondResult)
})
}
function getLanguageVariants(url) {
// for https://docs.github.com/en/foo, get https://docs.github.com/foo
const languagelessUrl = getPathWithoutLanguage(url.replace(requiredUrlPrefix, ''))
// then derive localized urls
return languageCodes.map((lc) => path.join(requiredUrlPrefix, lc, languagelessUrl))
}
function logStatus(result) {
// only log status if it's not ok
if (JSON.parse(result).status === 'ok') return
console.log(result)
}

View File

@@ -1,14 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# Starts the local development server. Equivalent of `npm start`.
#
# To keep things snappy, only English and Japanese are enabled.
# To run the server with all languages enabled, run script/server-all-languages
#
# [end-readme]
source script/check-for-node
npm start

View File

@@ -1,13 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# Starts the local development server with all of the available languages enabled.
#
# [end-readme]
source script/check-for-node
# TODO would need git clones from the language repos
npm run start-all-languages

View File

@@ -1,42 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// Run this script to standardize frontmatter fields in all content files,
// per the order:
// - title
// - intro
// - product callout
// - productVersion
// - map topic status
// - hidden status
// - layout
// - redirect
//
// [end-readme]
import { fileURLToPath } from 'url'
import path from 'path'
import fs from 'fs'
import walk from 'walk-sync'
import matter from 'gray-matter'
import { schema } from '../lib/frontmatter.js'
const __dirname = path.dirname(fileURLToPath(import.meta.url))
const properties = Object.keys(schema.properties)
const contentDir = path.join(__dirname, '../content')
const contentFiles = walk(contentDir, { includeBasePath: true }).filter(
(relativePath) => relativePath.endsWith('.md') && !relativePath.includes('README'),
)
contentFiles.forEach((fullPath) => {
const { content, data } = matter(fs.readFileSync(fullPath, 'utf8'))
const newData = {}
properties.forEach((prop) => {
// Check for undefined explicitly so falsy values like `hidden: false` are kept.
if (data[prop] !== undefined) newData[prop] = data[prop]
})
fs.writeFileSync(fullPath, matter.stringify(content, newData, { lineWidth: 10000 }))
})

View File

@@ -1,11 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# Runs tests. Equivalent of `npm test`.
#
# [end-readme]
source script/check-for-node
npm test

View File

@@ -1,15 +0,0 @@
#!/usr/bin/env bash
# [start-readme]
#
# List all the TODOs in our JavaScript files and stylesheets.
#
# [end-readme]
echo "JavaScript TODOs"
grep -R TODO . --include "*.js" --exclude-dir=node_modules
echo
echo
echo "Stylesheet TODOs"
grep -R TODO . --include "*.*css" --exclude-dir=node_modules

View File

@@ -1,91 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// This script scans the script directory for special comment markers
// in each script and adds those comments to `script/README.md`.
//
// [end-readme]
import { fileURLToPath } from 'url'
import path from 'path'
import fs from 'fs/promises'
import walk from 'walk-sync'
import dedent from 'dedent'
import { difference } from 'lodash-es'
const __dirname = path.dirname(fileURLToPath(import.meta.url))
const readme = path.join(__dirname, 'README.md')
const startComment = 'start-readme'
const endComment = 'end-readme'
const startCommentRegex = new RegExp(startComment)
const endCommentRegex = new RegExp(endComment)
const ignoreList = ['README.md']
const scriptsToRuleThemAll = ['bootstrap', 'server', 'test']
const allScripts = walk(__dirname, { directories: false }).filter((script) =>
ignoreList.every((ignoredPath) => !script.includes(ignoredPath)),
)
const otherScripts = difference(allScripts, scriptsToRuleThemAll)
// build an object with script name as key and readme comment as value
const allComments = {}
for (const script of allScripts) {
const fullPath = path.join(__dirname, script)
let addToReadme = false
const readmeComment = (await fs.readFile(fullPath, 'utf8'))
.split('\n')
.filter((cmt) => {
if (startCommentRegex.test(cmt)) addToReadme = true
if (endCommentRegex.test(cmt)) addToReadme = false
if (addToReadme && !cmt.includes(startComment) && !cmt.includes(endComment)) return cmt
return false
})
// remove comment markers and clean up newlines
.map((cmt) => cmt.replace(/^(\/\/|#) ?/m, ''))
.join('\n')
.trim()
allComments[script] = readmeComment
// preserve double newlines as multiline list items
.replace(/\n\n/g, '\n\n\n ')
// remove single newlines
.replace(/\n(?!\n)/g, ' ')
}
// turn the script names/comments into itemized lists in the README
const template = `# Scripts
## Scripts to rule them all
This directory follows the [Scripts to Rule Them All](https://githubengineering.com/scripts-to-rule-them-all/) pattern:
${createTemplate(scriptsToRuleThemAll)}
## Additional scripts
${createTemplate(otherScripts)}
`
// update the readme
if (template === (await fs.readFile(readme, 'utf8'))) {
console.log('The README is up-to-date!')
} else {
await fs.writeFile(readme, template)
console.log('The README.md has been updated!')
}
function createTemplate(arrayOfScripts) {
return arrayOfScripts
.map((script) => {
const comment = allComments[script]
return dedent`### [\`${script}\`](${script})\n\n${comment}\n\n---\n\n`
})
.join('\n')
}

View File

@@ -1,71 +0,0 @@
#!/usr/bin/env node
// [start-readme]
//
// This script creates or updates an index.md file for a given directory.
// It will add `children` frontmatter in alphabetical order and create versions: { fpt: '*', ghes: '*', ghae: '*', ghec: '*' }.
// It also prints a helpful message to update those values manually if needed.
//
// [end-readme]
import fs from 'fs'
import path from 'path'
import { sentenceCase } from 'change-case'
import { program } from 'commander'
import readFrontmatter from '../lib/read-frontmatter.js'
program
.description('Create or update an index.md file for a provided content directory')
.requiredOption('-d, --directory <content directory>')
.parse(process.argv)
const directory = path.posix.join(process.cwd(), program.opts().directory)
if (!fs.existsSync(directory)) {
console.error(`Error! ${directory} not found. Make sure directory name starts with "content/".`)
process.exit(1)
}
// Run it! This function may run recursively.
updateOrCreateToc(directory)
console.log(
'Done! Review the new or updated index.md files and update the 1) order of the children 2) versions as needed',
)
function updateOrCreateToc(directory) {
const children = fs.readdirSync(directory).filter((subpath) => !subpath.endsWith('index.md'))
if (!children.length) return
const tocFile = path.posix.join(directory, 'index.md')
let content, data
// If the index.md file already exists, read it (to be updated later).
if (fs.existsSync(tocFile)) {
const parsed = readFrontmatter(fs.readFileSync(tocFile, 'utf8'))
content = parsed.content
data = parsed.data
}
// If the index.md file does not exist, create it.
else {
content = ''
data = {
title: sentenceCase(path.basename(directory)), // fake the title of the index.md from the directory name
versions: { fpt: '*', ghes: '*', ghae: '*', ghec: '*' }, // default to all versions
}
}
// Add the children - this will default to the alphabetical list of files in the directory.
data.children = children.map((child) => `/${child.replace('.md', '')}`)
// Write the file.
const newContents = readFrontmatter.stringify(content, data, { lineWidth: 10000 })
fs.writeFileSync(tocFile, newContents)
// Process any child directories recursively.
children.forEach((child) => {
if (child.endsWith('.md')) return
updateOrCreateToc(path.posix.join(directory, child))
})
}

View File

@@ -8,8 +8,6 @@ We used to organize our code more by role. Client, stylesheets, server middlewar
## How to create and use subject folders
Run `script/create-subject.js --name x` to create a new subject folder.
Subjects do not need every element below. Not every element needs to be a folder. A subject folder looks like:
```

View File

@@ -60,5 +60,5 @@
"2022-11-28"
]
},
"sha": "0590bfae7149f3fee6429d1e031646d79abb8f9a"
"sha": "c86f07e1ca0d543d0b8fc7591991b02767e02deb"
}

View File

@@ -1,6 +1,5 @@
import cx from 'classnames'
-import { Link } from 'components/Link'
import { MarkdownContent } from 'components/ui/MarkdownContent'
import { GHAEReleaseNotesContextT } from './types'
import { GHAEReleaseNotePatch } from './GHAEReleaseNotePatch'
@@ -30,9 +29,9 @@ export function GHAEReleaseNotes({ context }: Props) {
{releases.map((release) => {
return (
<li key={release.version} className="my-2 px-3 f4 d-inline-block d-md-block">
-<Link className="text-underline" href={`#${release.version}`}>
+<a href={`#${release.version}`} className="text-underline">
{release.version}
-</Link>
+</a>
</li>
)
})}

View File

@@ -1,6 +1,5 @@
import cx from 'classnames'
-import { Link } from 'components/Link'
import { MarkdownContent } from 'components/ui/MarkdownContent'
import { GHESReleaseNotesContextT } from './types'
import { GHESReleaseNotePatch } from './GHESReleaseNotePatch'
@@ -34,9 +33,9 @@ export function GHESReleaseNotes({ context }: Props) {
{currentRelease.patches.map((patch) => {
return (
<li key={patch.version} className="my-2 px-3 f4 d-inline-block d-md-block">
-<Link className="text-underline" href={`#${patch.version}`}>
+<a href={`#${patch.version}`} className="text-underline">
{patch.version}
-</Link>
+</a>
</li>
)
})}

View File

@@ -153505,7 +153505,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/java",
"commit_oid": 12345678901234567000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
},
{
"id": 2,
@@ -153536,7 +153536,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/ruby",
"commit_oid": 23456789012345680000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
}
],
"schema": {
@@ -153870,7 +153870,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/java",
"commit_oid": 12345678901234567000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
},
"schema": {
"title": "CodeQL Database",

View File

@@ -165214,7 +165214,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/java",
"commit_oid": 12345678901234567000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
},
{
"id": 2,
@@ -165245,7 +165245,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/ruby",
"commit_oid": 23456789012345680000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
}
],
"schema": {
@@ -165579,7 +165579,7 @@
"created_at": "2022-09-12T12:14:32Z",
"updated_at": "2022-09-12T12:14:32Z",
"url": "https://api.github.com/repos/octocat/Hello-World/code-scanning/codeql/databases/java",
"commit_oid": 12345678901234567000
"commit_oid": "1927de39fefa25a9d0e64e3f540ff824a72f538c"
},
"schema": {
"title": "CodeQL Database",

View File

@@ -36,5 +36,5 @@
]
}
},
"sha": "0590bfae7149f3fee6429d1e031646d79abb8f9a"
"sha": "c86f07e1ca0d543d0b8fc7591991b02767e02deb"
}

View File

@@ -1,3 +1,3 @@
{
"sha": "0590bfae7149f3fee6429d1e031646d79abb8f9a"
"sha": "c86f07e1ca0d543d0b8fc7591991b02767e02deb"
}