Merge branch 'main' into martin389-patch-1
@@ -5,6 +5,7 @@ This repository contains the documentation website code and Markdown source file
GitHub's Docs team works on pre-production content in a private repo that regularly syncs with this public repo.
In this article:
- [Contributing](#contributing)
- [READMEs](#readmes)
- [License](#license)
@@ -34,6 +35,7 @@ If you have a solution to one of the open issues, you will need to fork the repo
We use GitHub Discussions to talk about all sorts of topics related to documentation and this site. For example: if you'd like help troubleshooting a PR, have a great new idea, or want to share something amazing you've learned in our docs, join us in [discussions](https://github.com/github/docs/discussions).
#### And that's it!
That's how you can get started easily as a member of the GitHub Documentation community. :sparkles:
If you want to know more, or you're making a more complex contribution, check out [Getting Started with Contributing](/CONTRIBUTING.md).
@@ -18,6 +18,8 @@ Service providers can partner with {% data variables.product.company_short %} to
### About {% data variables.product.prodname_secret_scanning %} for public repositories
{% data variables.product.prodname_secret_scanning_caps %} is automatically enabled on public repositories, where it scans code for known secret formats. When a match of your secret format is found in a public repository, {% data variables.product.company_short %} doesn't publicly disclose the information as an alert, but instead sends a payload to an HTTP endpoint of your choice. For an overview of how secret scanning works on public repositories, see "[Secret scanning](/developers/overview/secret-scanning)."
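To make the partner integration concrete, here is a minimal sketch of an HTTP endpoint that could receive such payloads. It assumes an Express server; the endpoint path and the payload fields (`token`, `type`, `url`) are illustrative placeholders rather than the documented contract, and a real integration would also verify the payload signature.

```js
// Minimal sketch of a service provider endpoint for secret scanning payloads.
// Assumptions: Express is available, and the payload is an array of matches
// with illustrative field names ({ token, type, url }).
const express = require('express')

const app = express()
app.use(express.json())

app.post('/secret-scanning/github', (req, res) => {
  const matches = Array.isArray(req.body) ? req.body : []

  for (const match of matches) {
    // Validate the reported token against your own records, then decide
    // whether to revoke it, rotate it, or contact the affected customer.
    console.log(`Possible leaked ${match.type} found at ${match.url}`)
  }

  // Acknowledge receipt; any revocation work can happen asynchronously.
  res.status(200).send('ok')
})

app.listen(3000)
```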
When you push to a public repository, {% data variables.product.product_name %} scans the content of the commits for secrets. If you switch a private repository to public, {% data variables.product.product_name %} scans the entire repository for secrets.
When {% data variables.product.prodname_secret_scanning %} detects a set of credentials, we notify the service provider who issued the secret. The service provider validates the credential and then decides whether they should revoke the secret, issue a new secret, or reach out to you directly, depending on the associated risks to you or the service provider.
@@ -65,6 +67,8 @@ When {% data variables.product.prodname_secret_scanning %} detects a set of cred
{% data reusables.secret-scanning.beta %}
If you're a repository administrator or an organization owner, you can enable {% data variables.product.prodname_secret_scanning %} for private repositories that are owned by organizations. You can enable {% data variables.product.prodname_secret_scanning %} for all your repositories, or for all new repositories within your organization. {% data variables.product.prodname_secret_scanning_caps %} is not available for user account-owned private repositories. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository)" and "[Managing security and analysis settings for your organization](/github/setting-up-and-managing-organizations-and-teams/managing-security-and-analysis-settings-for-your-organization)."
When you push commits to a private repository with {% data variables.product.prodname_secret_scanning %} enabled, {% data variables.product.product_name %} scans the contents of the commits for secrets.
When {% data variables.product.prodname_secret_scanning %} detects a secret in a private repository, {% data variables.product.prodname_dotcom %} sends alerts.
@@ -73,6 +77,8 @@ When {% data variables.product.prodname_secret_scanning %} detects a secret in a
- {% data variables.product.prodname_dotcom %} displays an alert in the repository. For more information, see "[Managing alerts from {% data variables.product.prodname_secret_scanning %}](/github/administering-a-repository/managing-alerts-from-secret-scanning)."
Repository administrators and organization owners can grant users and teams access to {% data variables.product.prodname_secret_scanning %} alerts. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository#granting-access-to-security-alerts)."
{% data variables.product.product_name %} currently scans private repositories for secrets issued by the following service providers.
- Adafruit
@@ -23,31 +23,31 @@ versions:
4. Under "Configure security and analysis features", to the right of the feature, click **Disable** or **Enable**.

### Granting access to {% data variables.product.prodname_dependabot_alerts %}
### Granting access to security alerts
After you enable {% data variables.product.prodname_dependabot_alerts %} for a repository in an organization, organization owners and repository administrators can view the alerts by default. You can give additional teams and people access to the alerts for a repository.
After you enable {% data variables.product.prodname_dependabot %} or {% data variables.product.prodname_secret_scanning %} alerts for a repository in an organization, organization owners and repository administrators can view the alerts by default. You can give additional teams and people access to the alerts for a repository.
{% note %}
Organization owners and repository administrators can only grant access to view {% data variables.product.prodname_dependabot_alerts %} to people or teams who have write access to the repo.
Organization owners and repository administrators can only grant access to view security alerts, such as {% data variables.product.prodname_dependabot %} and {% data variables.product.prodname_secret_scanning %} alerts, to people or teams who have write access to the repo.
{% endnote %}
{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.sidebar-settings %}
{% data reusables.repositories.navigate-to-security-and-analysis %}
4. Under "Dependabot alerts", in the search field, start typing the name of the person or team you'd like to find, then click a name in the list of matches.

4. Under "Access to alerts", in the search field, start typing the name of the person or team you'd like to find, then click a name in the list of matches.

5. Click **Save changes**.


### Removing access to {% data variables.product.prodname_dependabot_alerts %}
### Removing access to security alerts
{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.sidebar-settings %}
{% data reusables.repositories.navigate-to-security-and-analysis %}
4. Under "Dependabot alerts", to the right of the person or team whose access you'd like to remove, click {% octicon "x" aria-label="X symbol" %}.

4. Under "Access to alerts", to the right of the person or team whose access you'd like to remove, click {% octicon "x" aria-label="X symbol" %}.

### Further reading
@@ -71,7 +71,7 @@ When {% data variables.product.product_name %} identifies a vulnerable dependenc
You can see all of the alerts that affect a particular project{% if currentVersion == "free-pro-team@latest" %} on the repository's Security tab or{% endif %} in the repository's dependency graph.{% if currentVersion == "free-pro-team@latest" %} For more information, see "[Viewing and updating vulnerable dependencies in your repository](/articles/viewing-and-updating-vulnerable-dependencies-in-your-repository)."{% endif %}
{% if currentVersion == "free-pro-team@latest" or currentVersion ver_gt "enterprise-server@2.21" %}
By default, we notify people with admin permissions in the affected repositories about new {% data variables.product.prodname_dependabot_alerts %}.{% endif %} {% if currentVersion == "free-pro-team@latest" %}{% data variables.product.product_name %} never publicly discloses identified vulnerabilities for any repository. You can also make {% data variables.product.prodname_dependabot_alerts %} visible to additional people or teams working in repositories that you own or have admin permissions for. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository#granting-access-to-dependabot-alerts)."
By default, we notify people with admin permissions in the affected repositories about new {% data variables.product.prodname_dependabot_alerts %}.{% endif %} {% if currentVersion == "free-pro-team@latest" %}{% data variables.product.product_name %} never publicly discloses identified vulnerabilities for any repository. You can also make {% data variables.product.prodname_dependabot_alerts %} visible to additional people or teams working in repositories that you own or have admin permissions for. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository#granting-access-to-security-alerts)."
{% endif %}
{% if enterpriseServerVersions contains currentVersion and currentVersion ver_lt "enterprise-server@2.22" %}
@@ -474,7 +474,7 @@ For more information, see "[Restricting publication of {% data variables.product
| Action | Description
|------------------|-------------------
| `authorized_users_teams` | Triggered when an organization owner or a person with admin permissions to the repository updates the list of people or teams authorized to receive {% data variables.product.prodname_dependabot_alerts %} for vulnerable dependencies in the repository. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository#granting-access-to-dependabot-alerts)."
| `authorized_users_teams` | Triggered when an organization owner or a person with admin permissions to the repository updates the list of people or teams authorized to receive {% data variables.product.prodname_dependabot_alerts %} for vulnerable dependencies in the repository. For more information, see "[Managing security and analysis settings for your repository](/github/administering-a-repository/managing-security-and-analysis-settings-for-your-repository#granting-access-to-security-alerts)."
| `disable` | Triggered when a repository owner or person with admin access to the repository disables {% data variables.product.prodname_dependabot_alerts %}.
| `enable` | Triggered when a repository owner or person with admin access to the repository enables {% data variables.product.prodname_dependabot_alerts %}.
@@ -18708,6 +18708,11 @@ type Organization implements Actor & MemberStatusable & Node & PackageOwner & Pr
|
||||
hasSponsorsListing: Boolean!
|
||||
id: ID!
|
||||
|
||||
"""
|
||||
The interaction ability settings for this organization.
|
||||
"""
|
||||
interactionAbility: RepositoryInteractionAbility
|
||||
|
||||
"""
|
||||
The setting value for whether the organization has an IP allow list enabled.
|
||||
"""
|
||||
@@ -28857,6 +28862,11 @@ type Repository implements Node & PackageOwner & ProjectOwner & RepositoryInfo &
|
||||
homepageUrl: URI
|
||||
id: ID!
|
||||
|
||||
"""
|
||||
The interaction ability settings for this repository.
|
||||
"""
|
||||
interactionAbility: RepositoryInteractionAbility
|
||||
|
||||
"""
|
||||
Indicates if the repository is unmaintained.
|
||||
"""
|
||||
@@ -30082,6 +30092,71 @@ interface RepositoryInfo {
|
||||
usesCustomOpenGraphImage: Boolean!
|
||||
}
|
||||
|
||||
"""
|
||||
Repository interaction limit that applies to this object.
|
||||
"""
|
||||
type RepositoryInteractionAbility {
|
||||
"""
|
||||
The time the currently active limit expires.
|
||||
"""
|
||||
expiresAt: DateTime
|
||||
|
||||
"""
|
||||
The current limit that is enabled on this object.
|
||||
"""
|
||||
limit: RepositoryInteractionLimit!
|
||||
|
||||
"""
|
||||
The origin of the currently active interaction limit.
|
||||
"""
|
||||
origin: RepositoryInteractionLimitOrigin!
|
||||
}
|
||||
|
||||
"""
|
||||
A repository interaction limit.
|
||||
"""
|
||||
enum RepositoryInteractionLimit {
|
||||
"""
|
||||
Users that are not collaborators will not be able to interact with the repository.
|
||||
"""
|
||||
COLLABORATORS_ONLY
|
||||
|
||||
"""
|
||||
Users that have not previously committed to a repository’s default branch will be unable to interact with the repository.
|
||||
"""
|
||||
CONTRIBUTORS_ONLY
|
||||
|
||||
"""
|
||||
Users that have recently created their account will be unable to interact with the repository.
|
||||
"""
|
||||
EXISTING_USERS
|
||||
|
||||
"""
|
||||
No interaction limits are enabled.
|
||||
"""
|
||||
NO_LIMIT
|
||||
}
|
||||
|
||||
"""
|
||||
Indicates where an interaction limit is configured.
|
||||
"""
|
||||
enum RepositoryInteractionLimitOrigin {
|
||||
"""
|
||||
A limit that is configured at the organization level.
|
||||
"""
|
||||
ORGANIZATION
|
||||
|
||||
"""
|
||||
A limit that is configured at the repository level.
|
||||
"""
|
||||
REPOSITORY
|
||||
|
||||
"""
|
||||
A limit that is configured at the user-wide level.
|
||||
"""
|
||||
USER
|
||||
}
|
||||
|
||||
"""
|
||||
An invitation for a user to be added to a repository.
|
||||
"""
|
||||
@@ -37900,6 +37975,11 @@ type User implements Actor & Node & PackageOwner & ProfileOwner & ProjectOwner &
|
||||
): Hovercard!
|
||||
id: ID!
|
||||
|
||||
"""
|
||||
The interaction ability settings for this user.
|
||||
"""
|
||||
interactionAbility: RepositoryInteractionAbility
|
||||
|
||||
"""
|
||||
Whether or not this user is a participant in the GitHub Security Bug Bounty.
|
||||
"""
@@ -9,7 +9,7 @@
|
||||
{% data ui.helpfulness.able_to_find %}
|
||||
</h4>
|
||||
<p class="f6">
|
||||
<a href="/site-policy/github-privacy-statement">Privacy policy</a>
|
||||
<a href="/github/site-policy/github-privacy-statement">Privacy policy</a>
|
||||
</p>
|
||||
<p
|
||||
class="radio-group"
|
||||
|
||||
@@ -125,7 +125,7 @@ function getPerformance () {
|
||||
)
|
||||
const nav = performance?.getEntriesByType('navigation')?.[0]
|
||||
return {
|
||||
firstContentfulPaint: paint ? paint / 1000 : undefined,
|
||||
firstContentfulPaint: paint ? paint.startTime / 1000 : undefined,
|
||||
domInteractive: nav ? nav.domInteractive / 1000 : undefined,
|
||||
domComplete: nav ? nav.domComplete / 1000 : undefined,
|
||||
render: nav ? (nav.responseEnd - nav.requestStart) / 1000 : undefined
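The change above fixes a likely unit bug: if `paint` is a `PerformanceEntry` (for example, the first-contentful-paint entry), dividing the entry itself by 1000 yields `NaN`, so the numeric `startTime` property is what should be converted to seconds. A small browser-side sketch, assuming the entry comes from the Paint Timing API:

```js
// Sketch (browser context): why .startTime is needed.
// Assumes the Paint Timing API has recorded a first-contentful-paint entry.
const paint = performance.getEntriesByName('first-contentful-paint')[0]

if (paint) {
  console.log(paint / 1000)           // NaN: a PerformanceEntry is not a number
  console.log(paint.startTime / 1000) // e.g. 0.832 (milliseconds converted to seconds)
}
```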
@@ -1,8 +1,8 @@
|
||||
// Linkinator treats the following as regex.
|
||||
module.exports = [
|
||||
// Skip GitHub search links.
|
||||
'https://github.com/search?.*',
|
||||
'https://github.com/github/gitignore/search?',
|
||||
'https://github.com/search\\?',
|
||||
'https://github.com/github/gitignore/search\\?',
|
||||
|
||||
// These links require auth.
|
||||
'https://github.com/settings/profile',
|
||||
@@ -15,6 +15,6 @@ module.exports = [
|
||||
|
||||
// Oneoff links that link checkers think are broken but are not.
|
||||
'https://haveibeenpwned.com/',
|
||||
'https://www.ilo.org/dyn/normlex/en/f?p=NORMLEXPUB:12100:0::NO::P12100_ILO_CODE:P029',
|
||||
'https://www.ilo.org/dyn/normlex/en/f\\?p=NORMLEXPUB:12100:0::NO::P12100_ILO_CODE:P029',
|
||||
'http://www.w3.org/wiki/LinkHeader/'
|
||||
]
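The escaping matters because, as the comment at the top of the file notes, Linkinator treats these entries as regular expressions: an unescaped `?` is the "zero or one" quantifier rather than a literal question mark, so the old patterns could match more (or different) URLs than intended. A quick sketch of the difference:

```js
// Sketch: how regex matching treats the two forms.
// 'search?' in a regex means "searc" followed by an optional "h",
// while 'search\\?' matches the literal string "search?".
const unescaped = new RegExp('https://github.com/github/gitignore/search?')
const escaped = new RegExp('https://github.com/github/gitignore/search\\?')

console.log(unescaped.test('https://github.com/github/gitignore/searching'))   // true (over-matches)
console.log(escaped.test('https://github.com/github/gitignore/searching'))     // false
console.log(escaped.test('https://github.com/github/gitignore/search?q=node')) // true (literal "?")
```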
@@ -28545,6 +28545,14 @@
|
||||
"kind": "scalars",
|
||||
"href": "/graphql/reference/scalars#boolean"
|
||||
},
|
||||
{
|
||||
"name": "interactionAbility",
|
||||
"description": "<p>The interaction ability settings for this organization.</p>",
|
||||
"type": "RepositoryInteractionAbility",
|
||||
"id": "repositoryinteractionability",
|
||||
"kind": "objects",
|
||||
"href": "/graphql/reference/objects#repositoryinteractionability"
|
||||
},
|
||||
{
|
||||
"name": "ipAllowListEnabledSetting",
|
||||
"description": "<p>The setting value for whether the organization has an IP allow list enabled.</p>",
|
||||
@@ -42674,6 +42682,14 @@
|
||||
"kind": "scalars",
|
||||
"href": "/graphql/reference/scalars#uri"
|
||||
},
|
||||
{
|
||||
"name": "interactionAbility",
|
||||
"description": "<p>The interaction ability settings for this repository.</p>",
|
||||
"type": "RepositoryInteractionAbility",
|
||||
"id": "repositoryinteractionability",
|
||||
"kind": "objects",
|
||||
"href": "/graphql/reference/objects#repositoryinteractionability"
|
||||
},
|
||||
{
|
||||
"name": "isArchived",
|
||||
"description": "<p>Indicates if the repository is unmaintained.</p>",
|
||||
@@ -44553,6 +44569,39 @@
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "RepositoryInteractionAbility",
|
||||
"kind": "objects",
|
||||
"id": "repositoryinteractionability",
|
||||
"href": "/graphql/reference/objects#repositoryinteractionability",
|
||||
"description": "<p>Repository interaction limit that applies to this object.</p>",
|
||||
"fields": [
|
||||
{
|
||||
"name": "expiresAt",
|
||||
"description": "<p>The time the currently active limit expires.</p>",
|
||||
"type": "DateTime",
|
||||
"id": "datetime",
|
||||
"kind": "scalars",
|
||||
"href": "/graphql/reference/scalars#datetime"
|
||||
},
|
||||
{
|
||||
"name": "limit",
|
||||
"description": "<p>The current limit that is enabled on this object.</p>",
|
||||
"type": "RepositoryInteractionLimit!",
|
||||
"id": "repositoryinteractionlimit",
|
||||
"kind": "enums",
|
||||
"href": "/graphql/reference/enums#repositoryinteractionlimit"
|
||||
},
|
||||
{
|
||||
"name": "origin",
|
||||
"description": "<p>The origin of the currently active interaction limit.</p>",
|
||||
"type": "RepositoryInteractionLimitOrigin!",
|
||||
"id": "repositoryinteractionlimitorigin",
|
||||
"kind": "enums",
|
||||
"href": "/graphql/reference/enums#repositoryinteractionlimitorigin"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "RepositoryInvitation",
|
||||
"kind": "objects",
|
||||
@@ -52395,6 +52444,14 @@
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "interactionAbility",
|
||||
"description": "<p>The interaction ability settings for this user.</p>",
|
||||
"type": "RepositoryInteractionAbility",
|
||||
"id": "repositoryinteractionability",
|
||||
"kind": "objects",
|
||||
"href": "/graphql/reference/objects#repositoryinteractionability"
|
||||
},
|
||||
{
|
||||
"name": "isBountyHunter",
|
||||
"description": "<p>Whether or not this user is a participant in the GitHub Security Bug Bounty.</p>",
|
||||
@@ -59457,6 +59514,52 @@
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "RepositoryInteractionLimit",
|
||||
"kind": "enums",
|
||||
"id": "repositoryinteractionlimit",
|
||||
"href": "/graphql/reference/enums#repositoryinteractionlimit",
|
||||
"description": "<p>A repository interaction limit.</p>",
|
||||
"values": [
|
||||
{
|
||||
"name": "COLLABORATORS_ONLY",
|
||||
"description": "<p>Users that are not collaborators will not be able to interact with the repository.</p>"
|
||||
},
|
||||
{
|
||||
"name": "CONTRIBUTORS_ONLY",
|
||||
"description": "<p>Users that have not previously committed to a repository’s default branch will be unable to interact with the repository.</p>"
|
||||
},
|
||||
{
|
||||
"name": "EXISTING_USERS",
|
||||
"description": "<p>Users that have recently created their account will be unable to interact with the repository.</p>"
|
||||
},
|
||||
{
|
||||
"name": "NO_LIMIT",
|
||||
"description": "<p>No interaction limits are enabled.</p>"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "RepositoryInteractionLimitOrigin",
|
||||
"kind": "enums",
|
||||
"id": "repositoryinteractionlimitorigin",
|
||||
"href": "/graphql/reference/enums#repositoryinteractionlimitorigin",
|
||||
"description": "<p>Indicates where an interaction limit is configured.</p>",
|
||||
"values": [
|
||||
{
|
||||
"name": "ORGANIZATION",
|
||||
"description": "<p>A limit that is configured at the organization level.</p>"
|
||||
},
|
||||
{
|
||||
"name": "REPOSITORY",
|
||||
"description": "<p>A limit that is configured at the repository level.</p>"
|
||||
},
|
||||
{
|
||||
"name": "USER",
|
||||
"description": "<p>A limit that is configured at the user-wide level.</p>"
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "RepositoryInvitationOrderField",
|
||||
"kind": "enums",
|
||||
|
||||
@@ -123,6 +123,12 @@ class Page {
|
||||
|
||||
async _render (context) {
|
||||
this.intro = await renderContent(this.rawIntro, context)
|
||||
|
||||
// rewrite local links in the intro to include current language code and GHE version if needed
|
||||
const introHtml = cheerio.load(this.intro)
|
||||
rewriteLocalLinks(introHtml, context.currentVersion, context.currentLanguage)
|
||||
this.intro = introHtml('body').html()
|
||||
|
||||
this.introPlainText = await renderContent(this.rawIntro, context, { textOnly: true })
|
||||
this.title = await renderContent(this.rawTitle, context, { textOnly: true, encodeEntities: true })
|
||||
this.shortTitle = await renderContent(this.shortTitle, context, { textOnly: true, encodeEntities: true })
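For context, the intro is parsed with cheerio so that root-relative links inside it can be prefixed with the current language code and, where needed, the current version. The snippet below is a simplified sketch of that kind of rewrite, not the actual `lib/rewrite-local-links.js` implementation.

```js
// Simplified sketch of rewriting local links with cheerio.
// Assumption: this mirrors the idea of rewriteLocalLinks, not its exact logic.
const cheerio = require('cheerio')

function rewriteLocalLinksSketch (html, languageCode, version) {
  const $ = cheerio.load(html)

  $('a[href^="/"]').each((i, el) => {
    const href = $(el).attr('href')
    // Prefix root-relative links with the language code, and with the
    // version when it isn't the default free-pro-team version.
    const versionSegment = version === 'free-pro-team@latest' ? '' : `/${version}`
    $(el).attr('href', `/${languageCode}${versionSegment}${href}`)
  })

  return $('body').html()
}

// Example:
// rewriteLocalLinksSketch('<a href="/github/getting-started">docs</a>', 'en', 'enterprise-server@2.22')
// => '<a href="/en/enterprise-server@2.22/github/getting-started">docs</a>'
```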
@@ -1,12 +1,20 @@
|
||||
const webpack = require('webpack')
|
||||
const middleware = require('webpack-dev-middleware')
|
||||
const config = require('../webpack.config')
|
||||
const FriendlyErrorsWebpackPlugin = require('friendly-errors-webpack-plugin')
|
||||
|
||||
const webpackCompiler = webpack({
|
||||
...config,
|
||||
mode: 'development'
|
||||
mode: 'development',
|
||||
plugins: [
|
||||
...config.plugins,
|
||||
new FriendlyErrorsWebpackPlugin({
|
||||
clearConsole: false
|
||||
})
|
||||
]
|
||||
})
|
||||
|
||||
module.exports = middleware(webpackCompiler, {
|
||||
publicPath: config.output.publicPath
|
||||
publicPath: config.output.publicPath,
|
||||
logLevel: 'silent'
|
||||
})
|
||||
|
||||
package-lock.json (generated)
@@ -6718,6 +6718,15 @@
|
||||
"is-arrayish": "^0.2.1"
|
||||
}
|
||||
},
|
||||
"error-stack-parser": {
|
||||
"version": "2.0.6",
|
||||
"resolved": "https://registry.npmjs.org/error-stack-parser/-/error-stack-parser-2.0.6.tgz",
|
||||
"integrity": "sha512-d51brTeqC+BHlwF0BhPtcYgF5nlzf9ZZ0ZIUQNZpc9ZB9qw5IJ2diTrBY9jlCJkTLITYPjmiX6OWCwH+fuyNgQ==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"stackframe": "^1.1.1"
|
||||
}
|
||||
},
|
||||
"es-abstract": {
|
||||
"version": "1.17.7",
|
||||
"resolved": "https://registry.npmjs.org/es-abstract/-/es-abstract-1.17.7.tgz",
|
||||
@@ -8492,6 +8501,65 @@
|
||||
"resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
|
||||
"integrity": "sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac="
|
||||
},
|
||||
"friendly-errors-webpack-plugin": {
|
||||
"version": "1.7.0",
|
||||
"resolved": "https://registry.npmjs.org/friendly-errors-webpack-plugin/-/friendly-errors-webpack-plugin-1.7.0.tgz",
|
||||
"integrity": "sha512-K27M3VK30wVoOarP651zDmb93R9zF28usW4ocaK3mfQeIEI5BPht/EzZs5E8QLLwbLRJQMwscAjDxYPb1FuNiw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"chalk": "^1.1.3",
|
||||
"error-stack-parser": "^2.0.0",
|
||||
"string-width": "^2.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"ansi-regex": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz",
|
||||
"integrity": "sha1-7QMXwyIGT3lGbAKWa922Bas32Zg=",
|
||||
"dev": true
|
||||
},
|
||||
"chalk": {
|
||||
"version": "1.1.3",
|
||||
"resolved": "https://registry.npmjs.org/chalk/-/chalk-1.1.3.tgz",
|
||||
"integrity": "sha1-qBFcVeSnAv5NFQq9OHKCKn4J/Jg=",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"ansi-styles": "^2.2.1",
|
||||
"escape-string-regexp": "^1.0.2",
|
||||
"has-ansi": "^2.0.0",
|
||||
"strip-ansi": "^3.0.0",
|
||||
"supports-color": "^2.0.0"
|
||||
}
|
||||
},
|
||||
"is-fullwidth-code-point": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-2.0.0.tgz",
|
||||
"integrity": "sha1-o7MKXE8ZkYMWeqq5O+764937ZU8=",
|
||||
"dev": true
|
||||
},
|
||||
"string-width": {
|
||||
"version": "2.1.1",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-2.1.1.tgz",
|
||||
"integrity": "sha512-nOqH59deCq9SRHlxq1Aw85Jnt4w6KvLKqWVik6oA9ZklXLNIOlqg4F2yrT1MVaTjAqvVwdfeZ7w7aCvJD7ugkw==",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"is-fullwidth-code-point": "^2.0.0",
|
||||
"strip-ansi": "^4.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"strip-ansi": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-4.0.0.tgz",
|
||||
"integrity": "sha1-qEeQIusaw2iocTibY1JixQXuNo8=",
|
||||
"dev": true,
|
||||
"requires": {
|
||||
"ansi-regex": "^3.0.0"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"from": {
|
||||
"version": "0.1.7",
|
||||
"resolved": "https://registry.npmjs.org/from/-/from-0.1.7.tgz",
|
||||
@@ -18695,6 +18763,12 @@
|
||||
"integrity": "sha512-MTX+MeG5U994cazkjd/9KNAapsHnibjMLnfXodlkXw76JEea0UiNzrqidzo1emMwk7w5Qhc9jd4Bn9TBb1MFwA==",
|
||||
"dev": true
|
||||
},
|
||||
"stackframe": {
|
||||
"version": "1.2.0",
|
||||
"resolved": "https://registry.npmjs.org/stackframe/-/stackframe-1.2.0.tgz",
|
||||
"integrity": "sha512-GrdeshiRmS1YLMYgzF16olf2jJ/IzxXY9lhKOskuVziubpTYcYqyOwYeJKzQkwy7uN0fYSsbsC4RQaXf9LCrYA==",
|
||||
"dev": true
|
||||
},
|
||||
"start-server-and-test": {
|
||||
"version": "1.11.3",
|
||||
"resolved": "https://registry.npmjs.org/start-server-and-test/-/start-server-and-test-1.11.3.tgz",
|
||||
|
||||
@@ -98,6 +98,7 @@
|
||||
"eslint-plugin-node": "^11.1.0",
|
||||
"eslint-plugin-promise": "^4.2.1",
|
||||
"event-to-promise": "^0.8.0",
|
||||
"friendly-errors-webpack-plugin": "^1.7.0",
|
||||
"graphql": "^14.5.8",
|
||||
"heroku-client": "^3.1.0",
|
||||
"husky": "^4.2.1",
|
||||
|
||||
script/README.md
@@ -39,13 +39,6 @@ Usage: script/anonymize-branch.js <new-commit-message> [base-branch] Example: sc
|
||||
---
|
||||
|
||||
|
||||
### [`archive-enterprise-version.js`](archive-enterprise-version.js)
|
||||
|
||||
Run this script during the Enterprise deprecation process to download static copies of all pages for the oldest supported Enterprise version. See the Enterprise deprecation issue template for instructions.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`backfill-missing-localizations.js`](backfill-missing-localizations.js)
|
||||
|
||||
This script copies any English files that are missing from the translations directory into the translations directory. We only need to run this if problems occur with Crowdin's automatic sync.
|
||||
@@ -64,11 +57,9 @@ The `ignore` array is for client-side or build-time stuff that doesn't get `requ
|
||||
---
|
||||
|
||||
|
||||
### [`check-external-links`](check-external-links)
|
||||
### [`check-english-links.js`](check-english-links.js)
|
||||
|
||||
The script is run once per day via a scheduled GitHub Action to check all links in the site. It automatically opens an issue if it finds broken links. To exclude a URL from the link check, add it to `lib/excluded-links.js`.
|
||||
|
||||
For checking internal links, see `script/check-internal-links`.
|
||||
This script runs once per day via a scheduled GitHub Action to check all links in English content, not including deprecated Enterprise Server content. It opens an issue if it finds broken links. To exclude a link, add it to `lib/excluded-links.js`.
|
||||
|
||||
---
|
||||
|
||||
@@ -80,18 +71,9 @@ This script is run automatically when you run the server locally. It checks whet
|
||||
---
|
||||
|
||||
|
||||
### [`check-internal-links`](check-internal-links)
|
||||
|
||||
This script wraps tests/links-and-images.js and provides an option to output results to a file.
|
||||
|
||||
For more information, see `tests/README.md#broken-link-test`.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`check-s3-images.js`](check-s3-images.js)
|
||||
|
||||
Run this script in your branch to check whether any images referenced in Enterprise content are not in the expected S3 bucket. You will need to authenticate to S3 via `awssume` to use this script. Instructions for the one-time setup are [here](https://github.com/github/product-documentation/blob/master/doc-team-workflows/workflow-information-for-all-writers/setting-up-awssume-and-s3cmd.md).
|
||||
Run this script in your branch to check whether any images referenced in content are not in an expected S3 bucket. You will need to authenticate to S3 via `awssume` to use this script. Instructions for the one-time setup are [here](https://github.com/github/product-documentation/blob/master/doc-team-workflows/workflow-information-for-all-writers/setting-up-awssume-and-s3cmd.md).
|
||||
|
||||
---
|
||||
|
||||
@@ -114,6 +96,13 @@ Run this script in your branch to check whether any images referenced in Enterpr
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`content-migrations/update-developer-site-links.js`](content-migrations/update-developer-site-links.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
@@ -131,9 +120,37 @@ This script finds and lists all the Heroku staging apps and deletes any leftover
|
||||
---
|
||||
|
||||
|
||||
### [`get-blc-command.js`](get-blc-command.js)
|
||||
### [`enterprise-server-deprecations/archive-version.js`](enterprise-server-deprecations/archive-version.js)
|
||||
|
||||
This script parses options for `script/check-external-links`.
|
||||
Run this script during the Enterprise deprecation process to download static copies of all pages for the oldest supported Enterprise version. See the Enterprise deprecation issue template for instructions.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`enterprise-server-deprecations/remove-version-markup.js`](enterprise-server-deprecations/remove-version-markup.js)
|
||||
|
||||
Run this script after an Enterprise deprecation to remove Liquid statements and frontmatter that contain the deprecated Enterprise version. See the Enterprise deprecation issue template for instructions.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`enterprise-server-releases/create-webhooks-for-new-version.js`](enterprise-server-releases/create-webhooks-for-new-version.js)
|
||||
|
||||
This script creates new static webhook payload files for a new version.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`enterprise-server-releases/ghes-to-ghae-versioning.js`](enterprise-server-releases/ghes-to-ghae-versioning.js)
|
||||
|
||||
Run this script to add versions frontmatter and Liquid conditionals for GitHub AE, based on anything currently versioned for the provided release of Enterprise Server. This script should be run as part of the Enterprise Server release process.
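Based on the script body that appears later in this diff, the core transformation adds a `github-ae` entry to the page's `versions` frontmatter and appends a GitHub AE clause to matching Enterprise Server Liquid conditionals. A condensed sketch (version-range checks and file walking omitted):

```js
// Condensed sketch of the ghes-to-ghae transformation, based on the script
// body shown later in this diff. Range checks and file walking are omitted.
const frontmatter = require('@github-docs/frontmatter')

const enterpriseAEConditional = 'currentVersion == "github-ae@latest"'

function addGitHubAEVersioning (markdown) {
  const { data, content } = frontmatter(markdown)

  // 1. Add the frontmatter version.
  data.versions['github-ae'] = '*'

  // 2. Append the GitHub AE clause to Enterprise Server conditionals.
  const newContent = content.replace(
    /currentVersion (\S+?) "enterprise-server@\d+\.\d+"/g,
    match => `${match} or ${enterpriseAEConditional}`
  )

  return frontmatter.stringify(newContent, data, { lineWidth: 10000 })
}
```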
---
|
||||
|
||||
|
||||
### [`enterprise-server-releases/release-banner.js`](enterprise-server-releases/release-banner.js)
|
||||
|
||||
This script creates or removes a release candidate banner for a specified version.
|
||||
|
||||
---
|
||||
|
||||
@@ -242,63 +259,6 @@ This script moves reusables out of YAML files into individual Markdown files.
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/fixtures.js`](new-versioning/fixtures.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/main`](new-versioning/main)
|
||||
|
||||
All the new versioning!
|
||||
|
||||
Usage $ script/new-versioning/main
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/move-admin-dir.js`](new-versioning/move-admin-dir.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/update-content.js`](new-versioning/update-content.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/update-frontmatter.js`](new-versioning/update-frontmatter.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`new-versioning/update-not-fpt-conditionals.js`](new-versioning/update-not-fpt-conditionals.js)
|
||||
|
||||
Run this script to update these Liquid conditionals:
|
||||
|
||||
{% if currentVersion != 'free-pro-team@latest' %}
|
||||
|
||||
to:
|
||||
|
||||
{% if enterpriseServerVersions contains currentVersion %}
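A minimal sketch of the replacement this script performs (the regex is illustrative; the script itself may match more variants):

```js
// Sketch: rewrite the "not free-pro-team" conditional into the
// enterpriseServerVersions form. The regex here is illustrative.
function updateNotFptConditionals (content) {
  return content.replace(
    /{%\s*if currentVersion != ['"]free-pro-team@latest['"]\s*%}/g,
    '{% if enterpriseServerVersions contains currentVersion %}'
  )
}

// updateNotFptConditionals("{% if currentVersion != 'free-pro-team@latest' %}")
// => "{% if enterpriseServerVersions contains currentVersion %}"
```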
---
|
||||
|
||||
|
||||
### [`new-versioning/update-products-yml.js`](new-versioning/update-products-yml.js)
|
||||
|
||||
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`pages-with-liquid-titles.js`](pages-with-liquid-titles.js)
|
||||
|
||||
This is a temporary script to visualize which pages have liquid (and conditionals) in their `title` frontmatter
|
||||
@@ -368,20 +328,6 @@ An automated test checks for discrepancies between filenames and [autogenerated
|
||||
---
|
||||
|
||||
|
||||
### [`release-banner.js`](release-banner.js)
|
||||
|
||||
This script creates or removes a release candidate banner for a specified version.
|
||||
|
||||
|
||||
---
|
||||
|
||||
### [`remove-deprecated-enterprise-version-markup.js`](remove-deprecated-enterprise-version-markup.js)
|
||||
|
||||
Run this script after an Enterprise deprecation to remove Liquid statements and frontmatter that contain the deprecated Enterprise version. See the Enterprise deprecation issue template for instructions.
|
||||
|
||||
---
|
||||
|
||||
|
||||
### [`remove-extraneous-translation-files.js`](remove-extraneous-translation-files.js)
|
||||
|
||||
An [automated test](/tests/extraneous-translation-files.js) checks for files in the `translations/` directory that do not have an equivalent English file in the `content/` directory, and fails if it finds extraneous files. When the test fails, a human needs to run this script to remove the files.
|
||||
@@ -488,9 +434,10 @@ This script is used by other scripts to update temporary AWS credentials and aut
|
||||
---
|
||||
|
||||
|
||||
### [`upload-enterprise-images-to-s3.js`](upload-enterprise-images-to-s3.js)
|
||||
### [`upload-images-to-s3.js`](upload-images-to-s3.js)
|
||||
|
||||
Run this script to: [upload individual files to S3](https://github.com/github/product-documentation/blob/master/doc-team-workflows/workflow-information-for-all-writers/adding-individual-images-to-earlier-verisons-of-enterprise.md) or: [upload a batch of files to S3 for a new Enterprise release](https://github.com/github/product-documentation/blob/master/doc-team-workflows/working-on-enterprise-releases/information-for-all-writers/storing-a-batch-of-assets-on-s3-for-a-new-release.md). Run `upload-enterprise-images-to-s3.js --help` for usage details.
|
||||
Use this script to upload individual or batched asset files to a versioned S3 bucket. Run `upload-images-to-s3.js --help` for usage details.
|
||||
|
||||
---
|
||||
|
||||
|
||||
|
||||
@@ -3,31 +3,35 @@
|
||||
const path = require('path')
|
||||
const fs = require('fs')
|
||||
const linkinator = require('linkinator')
|
||||
const dedent = require('dedent')
|
||||
const program = require('commander')
|
||||
const { escapeRegExp } = require('lodash')
|
||||
const { pull, uniq } = require('lodash')
|
||||
const checker = new linkinator.LinkChecker()
|
||||
const rimraf = require('rimraf').sync
|
||||
const mkdirp = require('mkdirp').sync
|
||||
const root = 'https://docs.github.com'
|
||||
const englishRoot = `${root}/en`
|
||||
const { deprecated } = require('../lib/enterprise-server-releases')
|
||||
const got = require('got')
|
||||
|
||||
// Links with these codes may or may not really be broken.
|
||||
const retryStatusCodes = [429, 503]
|
||||
|
||||
// [start-readme]
|
||||
//
|
||||
// This script runs once per day via a scheduled GitHub Action to check all links in
|
||||
// English content, not including deprecated Enterprise Server content. It opens an issue
|
||||
// if it finds broken links. To exclude a link, add it to `lib/excluded-links.js`.
|
||||
// if it finds broken links. To exclude a link path, add it to `lib/excluded-links.js`.
|
||||
//
|
||||
// [end-readme]
|
||||
|
||||
program
|
||||
.description('Check all links in the English docs.')
|
||||
.option('-d, --dry-run', 'Turn off recursion to get a fast minimal report (useful for previewing output).')
|
||||
.option('-p, --path <PATH>', 'Provide an optional path to check. Best used with --dry-run. If not provided, defaults to the homepage.')
|
||||
.parse(process.argv)
|
||||
|
||||
// Skip excluded links defined in separate file.
|
||||
const excludedLinks = require('../lib/excluded-links')
|
||||
.map(link => escapeRegExp(link))
|
||||
|
||||
// Skip non-English content.
|
||||
const languagesToSkip = Object.keys(require('../lib/languages'))
|
||||
@@ -40,7 +44,7 @@ const languagesToSkip = Object.keys(require('../lib/languages'))
|
||||
const enterpriseReleasesToSkip = new RegExp(`${root}.+?[/@](${deprecated.join('|')})/`)
|
||||
|
||||
const config = {
|
||||
path: englishRoot,
|
||||
path: program.path || englishRoot,
|
||||
concurrency: 300,
|
||||
// If this is a dry run, turn off recursion.
|
||||
recurse: !program.dryRun,
|
||||
@@ -56,12 +60,10 @@ const config = {
|
||||
main()
|
||||
|
||||
async function main () {
|
||||
const startTime = new Date()
|
||||
|
||||
// Clear and recreate a directory for logs.
|
||||
const logFile = path.join(__dirname, '../.linkinator/full.log')
|
||||
rimraf(path.dirname(logFile))
|
||||
fs.mkdirSync(path.dirname(logFile), { recursive: true })
|
||||
mkdirp(path.dirname(logFile))
|
||||
|
||||
// Update CLI output and append to logfile after each checked link.
|
||||
checker.on('link', result => {
|
||||
@@ -69,27 +71,63 @@ async function main () {
|
||||
})
|
||||
|
||||
// Start the scan; events will be logged as they occur.
|
||||
const result = await checker.check(config)
|
||||
const result = (await checker.check(config)).links
|
||||
|
||||
// Scan is complete! Display the results.
|
||||
const endTime = new Date()
|
||||
const skippedLinks = result.links.filter(x => x.state === 'SKIPPED')
|
||||
const brokenLinks = result.links.filter(x => x.state === 'BROKEN')
|
||||
// Scan is complete! Filter the results for broken links.
|
||||
const brokenLinks = result
|
||||
.filter(link => link.state === 'BROKEN')
|
||||
|
||||
console.log(dedent`
|
||||
${brokenLinks.length} broken links found on docs.github.com
|
||||
// Links to retry individually.
|
||||
const linksToRetry = brokenLinks
|
||||
.filter(link => !link.status || retryStatusCodes.includes(link.status))
|
||||
|
||||
Link scan completed in ${endTime - startTime}ms
|
||||
Total links: ${result.links.length}
|
||||
Skipped links: ${skippedLinks.length}
|
||||
Broken links: ${brokenLinks.length}
|
||||
For more details see ${path.relative(process.cwd(), logFile)}
|
||||
`)
|
||||
await Promise.all(linksToRetry
|
||||
.map(async (link) => {
|
||||
try {
|
||||
// got throws an HTTPError if response code is not 2xx or 3xx.
|
||||
// If got succeeds, we can remove the link from the list.
|
||||
await got(link.url)
|
||||
pull(brokenLinks, link)
|
||||
// If got fails, do nothing. The link is already in the broken list.
|
||||
} catch (err) {
|
||||
// noop
|
||||
}
|
||||
}))
|
||||
|
||||
if (brokenLinks.length) {
|
||||
console.log('\n\n' + JSON.stringify(brokenLinks, null, 2))
|
||||
// Exit successfully if no broken links!
|
||||
if (!brokenLinks.length) {
|
||||
console.log('All links are good!')
|
||||
process.exit(0)
|
||||
}
|
||||
|
||||
// Format and display the results.
|
||||
console.log(`${brokenLinks.length} broken links found on docs.github.com\n`)
|
||||
displayBrokenLinks(brokenLinks)
|
||||
|
||||
// Exit unsuccessfully if broken links are found.
|
||||
process.exit(1)
|
||||
}
|
||||
|
||||
process.exit(0)
|
||||
function displayBrokenLinks (brokenLinks) {
|
||||
// Sort results by status code.
|
||||
const allStatusCodes = uniq(brokenLinks
|
||||
// Coerce undefined status codes into `Invalid` strings so we can display them.
|
||||
// Without this, undefined codes get JSON.stringified as `0`, which is not useful output.
|
||||
.map(link => {
|
||||
if (!link.status) link.status = 'Invalid'
|
||||
return link
|
||||
})
|
||||
.map(link => link.status)
|
||||
)
|
||||
|
||||
allStatusCodes.forEach(statusCode => {
|
||||
const brokenLinksForStatus = brokenLinks.filter(x => x.status === statusCode)
|
||||
|
||||
console.log(`## Status ${statusCode}: Found ${brokenLinksForStatus.length} broken links`)
|
||||
console.log('```')
|
||||
brokenLinksForStatus.forEach(brokenLinkObj => {
|
||||
console.log(JSON.stringify(brokenLinkObj, null, 2))
|
||||
})
|
||||
console.log('```')
|
||||
})
|
||||
}
|
||||
|
||||
@@ -10,10 +10,10 @@ const scrape = require('website-scraper')
|
||||
const program = require('commander')
|
||||
const rimraf = require('rimraf').sync
|
||||
const mkdirp = require('mkdirp').sync
|
||||
const version = require('../lib/enterprise-server-releases').oldestSupported
|
||||
const version = require('../../lib/enterprise-server-releases').oldestSupported
|
||||
const archivalRepoName = 'help-docs-archived-enterprise-versions'
|
||||
const archivalRepoUrl = `https://github.com/github/${archivalRepoName}`
|
||||
const loadRedirects = require('../lib/redirects/precompile')
|
||||
const loadRedirects = require('../../lib/redirects/precompile')
|
||||
|
||||
// [start-readme]
|
||||
//
|
||||
@@ -95,7 +95,7 @@ async function main () {
|
||||
}
|
||||
|
||||
console.log(`Enterprise version to archive: ${version}`)
|
||||
const pages = await (require('../lib/pages')())
|
||||
const pages = await (require('../../lib/pages')())
|
||||
const permalinksPerVersion = Object.keys(pages)
|
||||
.filter(key => key.includes(`/enterprise-server@${version}`))
|
||||
|
||||
@@ -6,11 +6,11 @@ const walk = require('walk-sync')
|
||||
const matter = require('gray-matter')
|
||||
const program = require('commander')
|
||||
const { indexOf, nth } = require('lodash')
|
||||
const removeLiquidStatements = require('../lib/remove-liquid-statements')
|
||||
const removeDeprecatedFrontmatter = require('../lib/remove-deprecated-frontmatter')
|
||||
const enterpriseServerReleases = require('../lib/enterprise-server-releases')
|
||||
const contentPath = path.join(__dirname, '../content')
|
||||
const dataPath = path.join(__dirname, '../data')
|
||||
const removeLiquidStatements = require('../../lib/remove-liquid-statements')
|
||||
const removeDeprecatedFrontmatter = require('../../lib/remove-deprecated-frontmatter')
|
||||
const enterpriseServerReleases = require('../../lib/enterprise-server-releases')
|
||||
const contentPath = path.join(__dirname, '../../content')
|
||||
const dataPath = path.join(__dirname, '../../data')
|
||||
const removeUnusedAssetsScript = 'script/remove-unused-assets'
|
||||
const elseifRegex = /{-?% elsif/
|
||||
|
||||
@@ -62,7 +62,7 @@ const allFiles = contentFiles.concat(dataFiles)
|
||||
|
||||
main()
|
||||
console.log(`\nRunning ${removeUnusedAssetsScript}...`)
|
||||
require(`../${removeUnusedAssetsScript}`)
|
||||
require(path.join(process.cwd(), removeUnusedAssetsScript))
|
||||
|
||||
function printElseIfFoundWarning (location) {
|
||||
console.log(`${location} has an 'elsif' condition! Resolve all elsifs by hand, then rerun the script.`)
|
||||
@@ -4,7 +4,7 @@ const fs = require('fs')
|
||||
const mkdirp = require('mkdirp').sync
|
||||
const path = require('path')
|
||||
const program = require('commander')
|
||||
const allVersions = require('../lib/all-versions')
|
||||
const allVersions = require('../../lib/all-versions')
|
||||
|
||||
// [start-readme]
|
||||
//
|
||||
@@ -4,7 +4,7 @@ const fs = require('fs')
|
||||
const path = require('path')
|
||||
const program = require('commander')
|
||||
const yaml = require('js-yaml')
|
||||
const allVersions = require('../lib/all-versions')
|
||||
const allVersions = require('../../lib/all-versions')
|
||||
const releaseCandidateFile = 'data/variables/release_candidate.yml'
|
||||
const releaseCandidateYaml = path.join(process.cwd(), releaseCandidateFile)
|
||||
|
||||
@@ -78,6 +78,9 @@ async function main () {
|
||||
updateStaticFile(previewsJson, path.join(graphqlStaticDir, 'previews.json'))
|
||||
updateStaticFile(upcomingChangesJson, path.join(graphqlStaticDir, 'upcoming-changes.json'))
|
||||
updateStaticFile(prerenderedObjects, path.join(graphqlStaticDir, 'prerendered-objects.json'))
|
||||
|
||||
// Ensure the YAML linter runs before checking in files
|
||||
execSync('npx prettier -w "**/*.{yml,yaml}"')
|
||||
}
|
||||
|
||||
// get latest from github/github
|
||||
|
||||
@@ -1,188 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const fs = require('fs')
|
||||
const path = require('path')
|
||||
const walk = require('walk-sync')
|
||||
const program = require('commander')
|
||||
const { uniq, dropRight } = require('lodash')
|
||||
const frontmatter = require('@github-docs/frontmatter')
|
||||
const contentPath = path.join(process.cwd(), 'content')
|
||||
const dataPath = path.join(process.cwd(), 'data')
|
||||
const translationsPath = path.join(process.cwd(), 'translations')
|
||||
const { latest } = require('../../lib/enterprise-server-releases')
|
||||
const { getEnterpriseServerNumber } = require('../../lib/patterns')
|
||||
const versionSatisfiesRange = require('../../lib/version-satisfies-range')
|
||||
const getApplicableVersions = require('../../lib/get-applicable-versions')
|
||||
const getDataReferences = require('../../lib/get-liquid-data-references')
|
||||
|
||||
// [start-readme]
|
||||
//
|
||||
// Run this script to add versions frontmatter and Liquid conditionals for
|
||||
// Enterprise AE, based on anything currently versioned for the latest release
|
||||
// of Enterprise Server. This script should be run as part of the Enterprise
|
||||
// Server release process.
|
||||
//
|
||||
// [end-readme]
|
||||
|
||||
program
|
||||
.description('Add versions frontmatter and Liquid conditionals for Enterprise AE based on the latest Enterprise Server release.')
|
||||
.option('-p, --product <PRODUCT_ID>', 'Product ID. Example: admin')
|
||||
.option('-t, --translations', 'Run the script on content and data in translations, too.')
|
||||
.parse(process.argv)
|
||||
|
||||
if (program.product) {
|
||||
console.log(`✅ Running on the ${program.product} product only`)
|
||||
} else {
|
||||
console.log('✅ Running on all products')
|
||||
}
|
||||
|
||||
if (program.translations) {
|
||||
console.log('✅ Running on both English and translated content and data\n')
|
||||
} else {
|
||||
console.log('✅ Running on English content and data\n')
|
||||
}
|
||||
|
||||
// The new conditional to add
|
||||
const enterpriseAEConditional = 'currentVersion == "github-ae@latest"'
|
||||
|
||||
// Match: currentVersion <operator> "enterprise-server@(\d+\.\d+)"
|
||||
const getEnterpriseServerConditional = new RegExp(`currentVersion (\\S+?) "${getEnterpriseServerNumber.source}"`)
|
||||
|
||||
console.log(`Adding versioning for Enterprise AE based on ${latest}!\n`)
|
||||
console.log('Working...\n')
|
||||
|
||||
const englishContentFiles = walkContent(contentPath)
|
||||
const englishDataFiles = walkData(dataPath, englishContentFiles)
|
||||
|
||||
function walkContent (dirPath) {
|
||||
if (program.product) {
|
||||
// Run on content/<product> only
|
||||
dirPath = path.join(contentPath, program.product)
|
||||
}
|
||||
return walk(dirPath, { includeBasePath: true, directories: false })
|
||||
.filter(file => file.includes('/content/'))
|
||||
.filter(file => file.endsWith('.md'))
|
||||
.filter(file => !file.endsWith('README.md'))
|
||||
}
|
||||
|
||||
function walkData (dirPath, contentFiles) {
|
||||
if (program.product) {
|
||||
const dataFilesPerProductInContent = getReferencedDataFiles(contentFiles)
|
||||
const dataFilesPerProductInData = getReferencedDataFiles(dataFilesPerProductInContent)
|
||||
const dataFilesPerProduct = dataFilesPerProductInContent.concat(dataFilesPerProductInData)
|
||||
return dataFilesPerProduct
|
||||
} else {
|
||||
return walk(dirPath, { includeBasePath: true, directories: false })
|
||||
.filter(file => file.includes('/data/reusables') || file.includes('/data/variables'))
|
||||
.filter(file => !file.endsWith('README.md'))
|
||||
}
|
||||
}
|
||||
|
||||
// Return an array of variable and reusable filenames referenced in a given set of files.
|
||||
function getReferencedDataFiles (files) {
|
||||
return uniq(files
|
||||
.map(file => getDataReferences(fs.readFileSync(file, 'utf8'))).flat()
|
||||
.map(dataRef => {
|
||||
dataRef = dataRef.replace('site.', '').replace(/\./g, '/')
|
||||
dataRef = dataRef.includes('variables') ? dropRight(dataRef.split('/')).join('/') : dataRef
|
||||
dataRef = dataRef.includes('variables') ? `${dataRef}.yml` : `${dataRef}.md`
|
||||
return path.join(process.cwd(), dataRef)
|
||||
}))
|
||||
}
|
||||
|
||||
let allContentFiles, allDataFiles
|
||||
if (program.translations) {
|
||||
const translatedContentFiles = walkContent(translationsPath)
|
||||
const translatedDataFiles = walkData(translationsPath, translatedContentFiles)
|
||||
allContentFiles = englishContentFiles.concat(translatedContentFiles)
|
||||
allDataFiles = englishDataFiles.concat(translatedDataFiles)
|
||||
} else {
|
||||
allContentFiles = englishContentFiles
|
||||
allDataFiles = englishDataFiles
|
||||
}
|
||||
|
||||
// Map Liquid operators to semver operators
|
||||
const operators = {
|
||||
ver_gt: '>',
|
||||
ver_lt: '<',
|
||||
'==': '='
|
||||
}
|
||||
|
||||
allDataFiles
|
||||
.forEach(file => {
|
||||
const dataContent = fs.readFileSync(file, 'utf8')
|
||||
|
||||
// Update Liquid in data files
|
||||
const newDataContent = updateLiquid(dataContent, file)
|
||||
|
||||
fs.writeFileSync(file, newDataContent)
|
||||
})
|
||||
|
||||
allContentFiles
|
||||
.forEach(file => {
|
||||
const { data, content } = frontmatter(fs.readFileSync(file, 'utf8'))
|
||||
|
||||
const applicableVersions = getApplicableVersions(data.versions, file)
|
||||
|
||||
// If the current page is not available in the latest version of GHES, nothing to do!
|
||||
if (!applicableVersions.includes(`enterprise-server@${latest}`)) return
|
||||
|
||||
// Add frontmatter version
|
||||
data.versions['github-ae'] = '*'
|
||||
|
||||
// Update Liquid in content files
|
||||
const newContent = updateLiquid(content, file)
|
||||
|
||||
// Update Liquid in frontmatter props
|
||||
Object.keys(data)
|
||||
.filter(key => typeof data[key] === 'string')
|
||||
.forEach(key => {
|
||||
data[key] = updateLiquid(data[key], file)
|
||||
})
|
||||
|
||||
fs.writeFileSync(file, frontmatter.stringify(newContent, data, { lineWidth: 10000 }))
|
||||
})
|
||||
|
||||
function updateLiquid (content, file) {
|
||||
const allConditionals = content.match(/{% if .+?%}/g)
|
||||
if (!allConditionals) return content
|
||||
|
||||
let newContent = content
|
||||
|
||||
allConditionals.forEach(conditional => {
|
||||
// Do not process a conditional that already includes github-ae
|
||||
if (conditional.includes('github-ae')) return
|
||||
|
||||
// Example match: currentVersion ver_gt "enterprise-server@2.21"
|
||||
const enterpriseServerMatch = conditional.match(getEnterpriseServerConditional)
|
||||
if (!enterpriseServerMatch) return
|
||||
|
||||
// Example liquid operator: ver_gt
|
||||
const liquidOperator = enterpriseServerMatch[1]
|
||||
|
||||
// Example semver operator: >
|
||||
const semverOperator = operators[liquidOperator]
|
||||
|
||||
// Example number: 2.21
|
||||
const number = enterpriseServerMatch[2]
|
||||
|
||||
// Example range: >2.21
|
||||
const range = `${semverOperator}${number}`
|
||||
|
||||
// Return early if the conditional does not apply to the latest GHES version;
|
||||
// that means it will not apply to GHPI either
|
||||
if (!versionSatisfiesRange(latest, range)) return
|
||||
|
||||
// First do the replacement within the conditional
|
||||
// Old: {% if currentVersion == "free-pro-team@latest" or currentVersion ver_gt "enterprise-server@2.21" %}
|
||||
// New: {% if currentVersion == "free-pro-team@latest" or currentVersion ver_gt "enterprise-server@2.21" or currentVersion == "github-ae@latest" %}
|
||||
const newConditional = conditional.replace(enterpriseServerMatch[0], `${enterpriseServerMatch[0]} or ${enterpriseAEConditional}`)
|
||||
|
||||
// Then replace all instances of the conditional in the content
|
||||
newContent = newContent.replace(conditional, newConditional)
|
||||
})
|
||||
|
||||
return newContent
|
||||
}
|
||||
|
||||
console.log('Done!')
|
||||
@@ -1,58 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const fs = require('fs')
|
||||
const path = require('path')
|
||||
const walk = require('walk-sync')
|
||||
const frontmatter = require('@github-docs/frontmatter')
|
||||
const fixturesPath = path.join(process.cwd(), 'tests/fixtures')
|
||||
|
||||
// NOTE this script does not run as part of script/new-versioning/main!
|
||||
// It was a one-time-use script that can be removed soon
|
||||
|
||||
const fixturesFiles = walk(fixturesPath, { includeBasePath: true, directories: false })
|
||||
.filter(file => file.endsWith('.md'))
|
||||
.filter(file => !file.includes('tests/fixtures/remove-liquid-statements'))
|
||||
|
||||
fixturesFiles
|
||||
.forEach(file => {
|
||||
const { data, content } = frontmatter(fs.readFileSync(file, 'utf8'))
|
||||
|
||||
// Update Liquid in content
|
||||
const newContent = content ? updateLiquid(content) : ''
|
||||
|
||||
// Update versions frontmatter
|
||||
if (data) {
|
||||
if (!data.versions && data.productVersions) {
|
||||
data.versions = data.productVersions
|
||||
Object.keys(data.versions).forEach(version => {
|
||||
// update dotcom, actions, rest, etc.
|
||||
if (version !== 'enterprise') {
|
||||
data.versions['free-pro-team'] = data.versions[version]
|
||||
delete data.versions[version]
|
||||
} else {
|
||||
data.versions['enterprise-server'] = data.versions.enterprise
|
||||
delete data.versions.enterprise
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
delete data.productVersions
|
||||
|
||||
// Update Liquid in frontmatter props
|
||||
Object.keys(data)
|
||||
// Only process a subset of props
|
||||
.filter(key => key === 'title' || key === 'intro' || key === 'product')
|
||||
.forEach(key => {
|
||||
data[key] = updateLiquid(data[key])
|
||||
})
|
||||
}
|
||||
|
||||
fs.writeFileSync(file, frontmatter.stringify(newContent, data, { lineWidth: 10000 }))
|
||||
})
|
||||
|
||||
function updateLiquid (content) {
|
||||
return content
|
||||
.replace(/page.version/g, 'currentVersion')
|
||||
.replace(/["'](?:')?dotcom["'](?:')?/g, '"free-pro-team@latest"')
|
||||
.replace(/["'](?:')?(2\.\d{2})["'](?:')?/g, '"enterprise-server@$1"')
|
||||
}
|
||||
@@ -1,32 +0,0 @@
|
||||
#!/usr/bin/env bash
|
||||
|
||||
# [start-readme]
|
||||
#
|
||||
# All the new versioning!
|
||||
#
|
||||
# Usage
|
||||
# $ script/new-versioning/main
|
||||
#
|
||||
# [end-readme]
|
||||
|
||||
base="script/new-versioning/"
|
||||
|
||||
scripts=(
|
||||
"move-admin-dir.js"
|
||||
"update-frontmatter.js"
|
||||
"update-content.js"
|
||||
"update-products-yml.js"
|
||||
)
|
||||
|
||||
for script in "${scripts[@]}"
|
||||
do
|
||||
fullPath="${base}${script}"
|
||||
printf "\n"
|
||||
echo "⭐ $script"
|
||||
"${fullPath}"
|
||||
|
||||
echo "${script} is done!"
|
||||
printf "\n\n"
|
||||
done
|
||||
|
||||
echo "done with all scripts!"
|
||||
@@ -1,64 +0,0 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')
const { execSync } = require('child_process')
const rimraf = require('rimraf').sync
const englishContentDir = 'content'
const walk = require('walk-sync')
const frontmatter = require('@github-docs/frontmatter')
const addRedirect = require('../../lib/redirects/add-redirect-to-frontmatter')

execSync(`mv ${englishContentDir}/enterprise/admin/ ${englishContentDir}/admin`)
rimraf(`${englishContentDir}/enterprise`)

fs.readdirSync('translations')
  .filter(file => !file.endsWith('.md'))
  .forEach(dir => {
    const translatedContentDir = path.join('translations', dir, 'content')
    execSync(`mv ${translatedContentDir}/enterprise/admin/ ${translatedContentDir}/admin`)
    rimraf(`${translatedContentDir}/enterprise`)
  })

const adminDir = path.join(process.cwd(), englishContentDir, 'admin')

// Add redirects to English
walk(adminDir, { includeBasePath: true, directories: false })
  .filter(file => file.endsWith('.md'))
  .forEach(file => {
    const contents = fs.readFileSync(file, 'utf8')
    const { data, content } = frontmatter(contents)

    const oldPath = file
      .replace(adminDir, '/enterprise/admin')
      .replace('.md', '')
      .replace('/index', '')

    data.redirect_from = addRedirect(data.redirect_from, oldPath)

    fs.writeFileSync(file, frontmatter.stringify(content, data, { lineWidth: 10000 }))
  })

// Add redirects to translations
const translationDirs = fs.readdirSync('translations')
  .filter(file => !file.endsWith('.md'))
  .map(dir => path.join('translations', dir, 'content/admin'))

translationDirs
  .forEach(translationDir => {
    walk(translationDir, { includeBasePath: true, directories: false })
      .filter(file => file.endsWith('.md'))
      .forEach(file => {
        const contents = fs.readFileSync(file, 'utf8')
        const { data, content } = frontmatter(contents)

        const oldPath = file
          .replace(translationDir, '/enterprise/admin')
          .replace('.md', '')
          .replace('/index', '')

        data.redirect_from = addRedirect(data.redirect_from, oldPath)

        fs.writeFileSync(file, frontmatter.stringify(content, data, { lineWidth: 10000 }))
      })
  })
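As a rough illustration of the `oldPath` logic above (the paths below are hypothetical, and `addRedirect` itself is not shown), the chained replacements map a moved admin file back to its pre-move `/enterprise/admin` URL so a redirect can be recorded in its frontmatter:

```js
// Illustrative only: how the oldPath transformation above behaves
// for a made-up file under content/admin.
const adminDir = '/repo/content/admin'
const file = '/repo/content/admin/enterprise-management/index.md'

const oldPath = file
  .replace(adminDir, '/enterprise/admin')  // swap the new base dir for the old URL prefix
  .replace('.md', '')                      // drop the file extension
  .replace('/index', '')                   // index pages map to their parent path

console.log(oldPath) // -> /enterprise/admin/enterprise-management
```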
@@ -1,36 +0,0 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')
const walk = require('walk-sync')
const { flatten } = require('lodash')
const dirsToProcess = ['content', 'data', 'translations']
const allFiles = flatten(dirsToProcess.map(dir => {
  return walk(path.join(process.cwd(), dir), { includeBasePath: true, directories: false })
    .filter(file => !file.endsWith('README.md'))
}))

allFiles.forEach(file => {
  let newContents = fs.readFileSync(file, 'utf8')
    .replace(/page.version/g, 'currentVersion')
    .replace(/["'](?:')?dotcom["'](?:')?/g, '"free-pro-team@latest"')
    .replace(/["'](?:')?(2\.\d{2})["'](?:')?/g, '"enterprise-server@$1"')
    // TODO handle this separately? requires a change in lib/rewrite-local-links.js
    // .replace(/class="dotcom-only"/g, 'class="do-not-version"')

  // replace this one weird subtitle
  if (file.endsWith('content/github/index.md')) {
    newContents = newContents.replace(`
{% if currentVersion != "free-pro-team@latest" %}
<h1 class="border-bottom-0">GitHub Enterprise Server {{ currentVersion }}</h1>
{% endif %}
`, '')
  }

  // update this one weird link
  if (file.endsWith('content/graphql/overview/public-schema.md')) {
    newContents = newContents.replace('(GitHub Enterprise {{ currentVersion }})', '({{ allVersions[currentVersion].versionTitle }})')
  }

  fs.writeFileSync(file, newContents)
})
@@ -1,72 +0,0 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')
const frontmatter = require('@github-docs/frontmatter')
const { flatten } = require('lodash')
const patterns = require('../../lib/patterns')
const walk = require('walk-sync')
const dirsToProcess = ['content', 'translations']
const allFiles = flatten(dirsToProcess.map(dir => {
  return walk(path.join(process.cwd(), dir), { includeBasePath: true, directories: false })
    .filter(file => !file.endsWith('README.md'))
    .filter(file => !(file.endsWith('LICENSE') || file.endsWith('LICENSE-CODE')))
    // we only want to process frontmatter in content files in translations, so skip data files
    // this is very brittle but works well enough for this script
    // (note data files are updated in script/new-versioning/update-content.js)
    .filter(file => !file.includes('/data/'))
}))

allFiles.forEach(file => {
  const contents = fs.readFileSync(file, 'utf8')
  const { data, content } = frontmatter(contents)

  if (!data.versions) {
    data.versions = data.productVersions
    Object.keys(data.versions).forEach(version => {
      // process dotcom, actions, rest, etc.
      if (version !== 'enterprise') {
        data.versions['free-pro-team'] = data.versions[version]
        delete data.versions[version]
      } else {
        data.versions['enterprise-server'] = data.versions.enterprise
        // TODO we are not adding these WIP versions yet
        // we can run a modified version of this script later to add them
        // data.versions['enterprise-cloud'] = '*'
        // data.versions['private-instances'] = '*'
        delete data.versions.enterprise
      }
    })
  }

  // remove hardcoded version numbers in redirect frontmatter
  // fix for https://github.com/github/docs-internal/issues/10835
  if (data.redirect_from) {
    data.redirect_from = Array.from([data.redirect_from]).flat().filter(oldPath => {
      return !patterns.getEnterpriseVersionNumber.test(oldPath)
    })
  }

  delete data.productVersions

  // update some oneoff content files
  if (file.endsWith('content/index.md')) {
    data.versions['enterprise-server'] = '*'
    // TODO we are not adding these WIP versions yet
    // we can run a modified version of this script later to add them
    // data.versions['enterprise-cloud'] = '*'
    // data.versions['private-instances'] = '*'
  }

  if (file.endsWith('content/github/index.md')) {
    data.title = 'GitHub.com'
    delete data.shortTitle
  }

  if (file.endsWith('content/admin/index.md')) {
    data.title = 'Enterprise Administrators'
    delete data.shortTitle
  }

  fs.writeFileSync(file, frontmatter.stringify(content, data, { lineWidth: 10000 }))
})
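To make the frontmatter rewrite above concrete, here is a hypothetical before/after of the `productVersions` → `versions` key mapping; the version ranges are invented for illustration and the loop body mirrors the one in the script above.

```js
// Hypothetical example of the key mapping performed by the script above.
const data = { productVersions: { dotcom: '*', enterprise: '>=2.21' } }

data.versions = data.productVersions
Object.keys(data.versions).forEach(version => {
  if (version !== 'enterprise') {
    // dotcom, actions, rest, etc. all collapse into free-pro-team
    data.versions['free-pro-team'] = data.versions[version]
    delete data.versions[version]
  } else {
    // enterprise becomes enterprise-server
    data.versions['enterprise-server'] = data.versions.enterprise
    delete data.versions.enterprise
  }
})
delete data.productVersions

console.log(data.versions)
// -> { 'free-pro-team': '*', 'enterprise-server': '>=2.21' }
```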
@@ -1,77 +0,0 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')
const walk = require('walk-sync')
const frontmatter = require('@github-docs/frontmatter')
const contentPath = path.join(process.cwd(), 'content')
const dataPath = path.join(process.cwd(), 'data')

// [start-readme]
//
// Run this script to update these Liquid conditionals:
//
// {% if currentVersion != 'free-pro-team@latest' %}
//
// to:
//
// {% if enterpriseServerVersions contains currentVersion %}
//
// [end-readme]

// The new conditional to add
const newConditional = 'enterpriseServerVersions contains currentVersion'

// The old conditional to replace
const oldConditional = /currentVersion != ["']free-pro-team@latest["']/g

console.log('Working...\n')

const englishContentFiles = walkContent(contentPath)
const englishDataFiles = walkData(dataPath, englishContentFiles)

function walkContent (dirPath) {
  return walk(dirPath, { includeBasePath: true, directories: false })
    .filter(file => file.includes('/content/'))
    .filter(file => file.endsWith('.md'))
    .filter(file => !file.endsWith('README.md'))
}

function walkData (dirPath, contentFiles) {
  return walk(dirPath, { includeBasePath: true, directories: false })
    .filter(file => file.includes('/data/reusables') || file.includes('/data/variables'))
    .filter(file => !file.endsWith('README.md'))
}

englishDataFiles
  .forEach(file => {
    const dataContent = fs.readFileSync(file, 'utf8')

    // Update Liquid in data files
    const newDataContent = updateLiquid(dataContent, file)

    fs.writeFileSync(file, newDataContent)
  })

englishContentFiles
  .forEach(file => {
    const { data, content } = frontmatter(fs.readFileSync(file, 'utf8'))

    // Update Liquid in content files
    const newContent = updateLiquid(content, file)

    // Update Liquid in frontmatter props
    Object.keys(data)
      .filter(key => typeof data[key] === 'string')
      .forEach(key => {
        data[key] = updateLiquid(data[key], file)
      })

    fs.writeFileSync(file, frontmatter.stringify(newContent, data, { lineWidth: 10000 }))
  })

function updateLiquid (content) {
  return content.replace(oldConditional, newConditional)
}

console.log('Done!')
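A minimal sketch, not part of the patch, of the single regex swap described in the readme block above, applied to a made-up Liquid snippet:

```js
// Illustrative only: the conditional swap performed by the script above.
const oldConditional = /currentVersion != ["']free-pro-team@latest["']/g
const newConditional = 'enterpriseServerVersions contains currentVersion'

const before = "{% if currentVersion != 'free-pro-team@latest' %}enterprise-only text{% endif %}"
const after = before.replace(oldConditional, newConditional)

console.log(after)
// -> {% if enterpriseServerVersions contains currentVersion %}enterprise-only text{% endif %}
```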
@@ -1,21 +0,0 @@
#!/usr/bin/env node

const fs = require('fs')
const path = require('path')
const productsFile = path.join(process.cwd(), 'data/products.yml')

const contents = `# this sequence sets the product order in the sidebar
# the product IDs match keys in lib/all-products.js
# note this file should not be translated
productsInOrder:
- github
- admin
- actions
- packages
- developers
- rest
- graphql
- insights
- desktop`

fs.writeFileSync(productsFile, contents)
@@ -10,9 +10,10 @@
  table-layout: auto;

  code {
    font-size: 100%;
    background: none;
    padding: 0;
    font-size: 85%;
    padding: 0.2em 0.4em;
    background-color: rgba($black, 0.05);
    border-radius: $border-radius;
  }

  thead tr {
@@ -90,6 +90,21 @@ describe('Page class', () => {
    expect($(`a[href="/en/${nonEnterpriseDefaultVersion}/articles/about-pull-requests"]`).length).toBeGreaterThan(0)
  })

  test('rewrites links in the intro to include the current language prefix and version', async () => {
    const page = new Page(opts)
    page.rawIntro = '[Pull requests](/articles/about-pull-requests)'
    const context = {
      page: { version: nonEnterpriseDefaultVersion },
      currentVersion: nonEnterpriseDefaultVersion,
      currentPath: '/en/github/collaborating-with-issues-and-pull-requests/about-branches',
      currentLanguage: 'en'
    }
    await page.render(context)
    const $ = cheerio.load(page.intro)
    expect($('a[href="/articles/about-pull-requests"]').length).toBe(0)
    expect($(`a[href="/en/${nonEnterpriseDefaultVersion}/articles/about-pull-requests"]`).length).toBeGreaterThan(0)
  })

  test('does not rewrite links that include deprecated enterprise release numbers', async () => {
    const page = new Page({
      relativePath: 'admin/enterprise-management/migrating-from-github-enterprise-1110x-to-2123.md',