Merge branch 'trunk' of github.com:Yoast/wordpress-seo into upgrade-eslint-to-9

diedexx committed Dec 24, 2024
2 parents 39fbcbd + 25587e2 commit 950d53e
Showing 75 changed files with 2,854 additions and 1,263 deletions.
24 changes: 24 additions & 0 deletions .github/workflows/merge-conflict-check.yml
@@ -0,0 +1,24 @@
name: Check PRs for merge conflicts

on:
  # Check for new conflicts due to merges.
  push:
    branches:
      - main
      - trunk
      - 'release/**'
      - 'hotfix/[0-9]+.[0-9]+*'
      - 'feature/**'
  # Check conflicts in new PRs and for resolved conflicts due to an open PR being updated.
  pull_request_target:
    types:
      - opened
      - synchronize
      - reopened

jobs:
  check-prs:
    if: github.repository_owner == 'Yoast'

    name: Check PRs for merge conflicts
    uses: Yoast/.github/.github/workflows/reusable-merge-conflict-check.yml@main
4 changes: 0 additions & 4 deletions admin/class-expose-shortlinks.php
@@ -58,10 +58,6 @@ class WPSEO_Expose_Shortlinks implements WPSEO_WordPress_Integration {
'shortlinks.activate_premium_info' => 'https://yoa.st/activate-subscription',
'shortlinks.upsell.sidebar.morphology_upsell_metabox' => 'https://yoa.st/morphology-upsell-metabox',
'shortlinks.upsell.sidebar.morphology_upsell_sidebar' => 'https://yoa.st/morphology-upsell-sidebar',
'shortlinks.semrush.volume_help' => 'https://yoa.st/3-v',
'shortlinks.semrush.trend_help' => 'https://yoa.st/3-v',
'shortlinks.semrush.prices' => 'https://yoa.st/semrush-prices',
'shortlinks.semrush.premium_landing_page' => 'https://yoa.st/413',
'shortlinks.wincher.seo_performance' => 'https://yoa.st/wincher-integration',
'shortlinks-insights-estimated_reading_time' => 'https://yoa.st/4fd',
'shortlinks-insights-flesch_reading_ease' => 'https://yoa.st/34r',
4 changes: 2 additions & 2 deletions admin/class-gutenberg-compatibility.php
@@ -15,14 +15,14 @@ class WPSEO_Gutenberg_Compatibility {
	 *
	 * @var string
	 */
	public const CURRENT_RELEASE = '19.8.0';
	public const CURRENT_RELEASE = '19.9.0';

	/**
	 * The minimally supported version of Gutenberg by the plugin.
	 *
	 * @var string
	 */
	public const MINIMUM_SUPPORTED = '19.8.0';
	public const MINIMUM_SUPPORTED = '19.9.0';

	/**
	 * Holds the current version.
138 changes: 135 additions & 3 deletions apps/content-analysis-api/analyze.http

Large diffs are not rendered by default.

57 changes: 57 additions & 0 deletions apps/content-analysis-api/helpers/get-researcher.js
@@ -0,0 +1,57 @@
const { helpers, languageProcessing } = require( "yoastseo" );

// Premium researches and helpers
const keyphraseDistribution = languageProcessing.researches.keyphraseDistribution;
const wordComplexity = languageProcessing.researches.wordComplexity;
const getLongCenterAlignedTexts = languageProcessing.researches.getLongCenterAlignedTexts;
const getLanguagesWithWordComplexity = helpers.getLanguagesWithWordComplexity;

const MORPHOLOGY_VERSIONS = {
	en: "v6",
	de: "v10",
	es: "v10",
	fr: "v11",
	it: "v10",
	nl: "v9",
	ru: "v10",
	id: "v9",
	pt: "v9",
	pl: "v9",
	ar: "v9",
	sv: "v1",
	he: "v1",
	hu: "v2",
	nb: "v1",
	tr: "v1",
	cs: "v1",
	sk: "v1",
	ja: "v1",
};

/**
 * Retrieves a Researcher instance for a specific language.
 *
 * @param {string} language The language to get the Researcher for.
 * @returns {Researcher} The Researcher instance.
 */
const getResearcher = ( language ) => {
	const dataVersion = MORPHOLOGY_VERSIONS[ language ];
	// eslint-disable-next-line global-require
	const { "default": Researcher } = require( `yoastseo/build/languageProcessing/languages/${language}/Researcher` );
	// eslint-disable-next-line global-require
	const premiumData = require( `yoastseo/premium-configuration/data/morphologyData-${language}-${dataVersion}.json` );

	const researcher = new Researcher();
	researcher.addResearchData( "morphology", premiumData );
	researcher.addResearch( "keyphraseDistribution", keyphraseDistribution );
	if ( getLanguagesWithWordComplexity().includes( language ) ) {
		researcher.addResearch( "wordComplexity", wordComplexity );
		researcher.addHelper( "checkIfWordIsComplex", helpers.getWordComplexityHelper( language ) );
		researcher.addConfig( "wordComplexity", helpers.getWordComplexityConfig( language ) );
	}
	researcher.addResearch( "getLongCenterAlignedTexts", getLongCenterAlignedTexts );

	return researcher;
};

module.exports = { getResearcher };
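
Not part of the diff, but for orientation: a minimal sketch of how this helper could be consumed, assuming the yoastseo `Researcher` exposes `setPaper`/`getResearch` as in its `AbstractResearcher` base class. The example text and keyphrase are invented.

```js
// Hypothetical consumer of getResearcher() — not included in this commit.
const { Paper } = require( "yoastseo" );
const { getResearcher } = require( "./helpers/get-researcher" );

// Build a researcher for English, including the premium morphology data and researches.
const researcher = getResearcher( "en" );

// A Paper bundles the text with its analysis attributes (keyword, locale, etc.).
const paper = new Paper(
	"<p>Brewing coffee at home starts with freshly ground beans.</p>",
	{ keyword: "brewing coffee", locale: "en_US" }
);

// Point the researcher at the paper and run one of the registered researches.
researcher.setPaper( paper );
console.log( researcher.getResearch( "keyphraseDistribution" ) );
```
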
99 changes: 3 additions & 96 deletions apps/content-analysis-api/index.js
@@ -1,101 +1,8 @@
const { Paper, App, interpreters, languageProcessing, assessments, helpers, assessors } = require( "yoastseo" );
const express = require( "express" );
const { SEOAssessor, ContentAssessor, RelatedKeywordAssessor, InclusiveLanguageAssessor } = assessors;
const express = require( "express" ), app = express();

// Premium assessments
const keyphraseDistribution = languageProcessing.researches.keyphraseDistribution;
const KeyphraseDistributionAssessment = assessments.seo.KeyphraseDistributionAssessment;

const TextTitleAssessment = assessments.seo.TextTitleAssessment;

const getLanguagesWithWordComplexity = helpers.getLanguagesWithWordComplexity;
const wordComplexity = languageProcessing.researches.wordComplexity;
const WordComplexityAssessment = assessments.readability.WordComplexityAssessment;

const getLongCenterAlignedTexts = languageProcessing.researches.getLongCenterAlignedTexts;
const TextAlignmentAssessment = assessments.readability.TextAlignmentAssessment;

const MORPHOLOGY_VERSIONS = {
	en: "v6",
	de: "v10",
	es: "v10",
	fr: "v11",
	it: "v10",
	nl: "v9",
	ru: "v10",
	id: "v9",
	pt: "v9",
	pl: "v9",
	ar: "v9",
	sv: "v1",
	he: "v1",
	hu: "v2",
	nb: "v1",
	tr: "v1",
	cs: "v1",
	sk: "v1",
	ja: "v1",
};

/**
 * Maps the result to a view model, ready to be sent to the client.
 * @param {AssessmentResult[]} result The result to map.
 * @returns {{score, editFieldName, text, marks}} The view model.
 */
const resultToVM = ( result ) => {
	const { _identifier, score, text, marks, editFieldName } = result;
	return { _identifier, score, text, marks, editFieldName, rating: interpreters.scoreToRating( score ) };
};

const app = express();
app.use( express.json() );

app.get( "/analyze", ( request, response ) => {
	// Fetch the Researcher and set the morphology data for the given language (yes, this is a bit hacky)
	const language = request.body.locale || "en";
	const dataVersion = MORPHOLOGY_VERSIONS[ language ];
	// eslint-disable-next-line global-require
	const { "default": Researcher } = require( `yoastseo/build/languageProcessing/languages/${language}/Researcher` );
	// eslint-disable-next-line global-require
	const premiumData = require( `yoastseo/premium-configuration/data/morphologyData-${language}-${dataVersion}.json` );

	const researcher = new Researcher();
	researcher.addResearchData( "morphology", premiumData );
	researcher.addResearch( "keyphraseDistribution", keyphraseDistribution );
	if ( getLanguagesWithWordComplexity().includes( language ) ) {
		researcher.addResearch( "wordComplexity", wordComplexity );
		researcher.addHelper( "checkIfWordIsComplex", helpers.getWordComplexityHelper( language ) );
		researcher.addConfig( "wordComplexity", helpers.getWordComplexityConfig( language ) );
	}
	researcher.addResearch( "getLongCenterAlignedTexts", getLongCenterAlignedTexts );

	const seoAssessor = new SEOAssessor( researcher );
	seoAssessor.addAssessment("keyphraseDistribution", new KeyphraseDistributionAssessment());
	seoAssessor.addAssessment("TextTitleAssessment", new TextTitleAssessment());
	const contentAssessor = new ContentAssessor( researcher );
	contentAssessor.addAssessment("wordComplexity", new WordComplexityAssessment());
	contentAssessor.addAssessment("textAlignment", new TextAlignmentAssessment());
	const relatedKeywordAssessor = new RelatedKeywordAssessor( researcher );
	const inclusiveLanguageAssessor = new InclusiveLanguageAssessor( researcher );

	const paper = new Paper(
		request.body.text || "",
		request.body || {}
	);

	seoAssessor.assess( paper );
	contentAssessor.assess( paper );
	relatedKeywordAssessor.assess( paper );
	inclusiveLanguageAssessor.assess( paper );

	response.json( {
		seo: seoAssessor.getValidResults().map( resultToVM ),
		readability: contentAssessor.getValidResults().map( resultToVM ),
		relatedKeyword: relatedKeywordAssessor.getValidResults().map( resultToVM ),
		inclusiveLanguage: inclusiveLanguageAssessor.getValidResults().map( resultToVM ),
	} );
} );

require('./routes/analyze')(app);
require('./routes/research')(app);

// Failing example using the App class. App uses createMeasurementElement, which is a browser-only function.
app.get( "/app", ( req, res ) => {
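
The rewritten entry point above delegates to `./routes/analyze` and `./routes/research`, which are outside this excerpt. Purely to illustrate the `require( './routes/...' )( app )` pattern, a module of that shape might look like the sketch below; the file name, route path, and body handling are assumptions, not the modules shipped in this commit.

```js
// routes/research.js — hypothetical sketch of a route module, not the actual file.
const { Paper } = require( "yoastseo" );
const { getResearcher } = require( "../helpers/get-researcher" );

module.exports = ( app ) => {
	app.get( "/research/:research", ( request, response ) => {
		// Mirrors the old /analyze handler: locale and text come from the JSON body.
		const language = request.body.locale || "en";
		const researcher = getResearcher( language );
		const paper = new Paper( request.body.text || "", request.body || {} );

		researcher.setPaper( paper );
		// The real module presumably maps kebab-case names like "word-count" to
		// research names; here the URL segment is used as-is.
		response.json( researcher.getResearch( request.params.research ) );
	} );
};
```
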
4 changes: 2 additions & 2 deletions apps/content-analysis-api/package.json
@@ -4,10 +4,10 @@
  "main": "index.js",
  "scripts": {
    "postinstall": "yarn --cwd node_modules/yoastseo build && yarn --cwd node_modules/yoastseo pretest",
    "start": "node index.js"
    "start": "node --watch index.js"
  },
  "dependencies": {
    "express": "^4.19.2",
    "express": "^5.0.1",
    "yoastseo": "link:../../packages/yoastseo"
  }
}
72 changes: 62 additions & 10 deletions apps/content-analysis-api/readme.md
@@ -1,31 +1,83 @@
# Content Analysis API
This is a small sample/PoC project to demonstrate how to use yoastseo in a Node.js environment.
Using yoastseo should not require any additional build steps.
This is a small sample/PoC project to demonstrate how to use the `yoastseo` package in a Node.js environment.

## Getting started

You can run this project locally by executing the following commands.

First, install the dependencies.

```bash
yarn install
yarn
```

Then, start the server with one of the following commands:

# Either
yarn start # Picks a random port.
# Or
PORT=3000 yarn start # Specify a port.
```bash
yarn start # Picks a random port.
PORT=3000 yarn start # Specify a port (used in documentation).
```

The server will automatically reload when you make changes to the code.

## Usage
### /analyze

This section describes the available endpoints and how to use them.
See the [API documentation](./analyze.http) for additional examples.

### /analyze/{type}
Send a `GET` request to http://localhost:3000/analyze (or whatever port your app is running on). Pass along a JSON body:
```json
{
"text": "<p>The text you want to analyze.</p>",
"keyword": "The keyphrase you want to rank for.",
"keyword": "The keyphrase you want to rank for."
}
```
As a response, you will receive a JSON object with the results of the content analysis.

You can add any of the properties described by the [`Paper`'s `attributes` parameter](../../packages/yoastseo/src/values/Paper.js) ([see this JSDoc block](https://github.com/Yoast/wordpress-seo/blob/434b6d0eb79659dffe44676da96c1640094137a1/packages/yoastseo/src/values/Paper.js#L26-L40)).
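
As an illustration of such a request from Node itself (hypothetical client code, not part of this repository; the built-in `http` module is used because the endpoint expects a `GET` request that carries a JSON body, which `fetch` does not allow):

```js
const http = require( "node:http" );

const body = JSON.stringify( {
	text: "<p>The text you want to analyze.</p>",
	keyword: "The keyphrase you want to rank for.",
} );

const request = http.request(
	{
		host: "localhost",
		port: 3000,
		path: "/analyze",
		method: "GET",
		headers: {
			"Content-Type": "application/json",
			"Content-Length": Buffer.byteLength( body ),
		},
	},
	( response ) => {
		let data = "";
		response.on( "data", ( chunk ) => {
			data += chunk;
		} );
		// Print the analysis results once the response is complete.
		response.on( "end", () => console.log( JSON.parse( data ) ) );
	}
);

request.write( body );
request.end();
```
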

This endpoint will run a set list of assessors on the text and return the results in JSON format.
Possible values for `{type}` are:
- (none) - Runs all analyses.
- `seo` - Runs the SEO analysis.
- `readability` - Runs the readability analysis.
- `inclusive-language` - Runs the inclusive language analysis.
- `related-keyword` - Runs the related keyword analysis.
- `meta-description` - Runs the SEO analysis for the meta description.
- `title` - Runs the SEO analysis for the SEO title.
- `keyword` - Runs the SEO analysis for the focus keyphrase.
- `keyword-use` - Runs the SEO analysis for the keyphrase use in the text.

### /research/{research}

This endpoint fires an individual research on a given `Paper`.

Possible values for `{research}` are:
- `estimated-reading-time`
- `flesch-reading-ease`
- `word-count`
- `sentence-count`
- `paragraph-count`

### /app
Send a `GET` request to http://localhost:3000/app (or whatever port your app is running on).

This will fail with an error, demonstrating that the `App` class is not (yet) available in the Node.js environment.

_TODO:_
- [ ] Expand: Allow setting a post type (`term`, `product`, etc.), and run the analysis for that specific post type (e.g. for terms, use the `TaxonomyAssessor`).
- [ ] New endpoint: Create an endpoint that tokenizes the given HTML text (see below).
- [ ] New endpoint: Create an endpoint that outputs the configuration data per language.
- [ ] Application: Create a WP-CLI command that fetches a post from WordPress and analyzes it through a (locally running) Yoast SEO (see below).
- [ ] Application: Create a browser extension that allows users to analyze the content of a page they are currently viewing. This could be a simple extension that sends the content of an element to the API and displays the results in a popup.
- [ ] Hosting: Deploy the API to a server and make it available for public use.

### New endpoint: /tokenize

This endpoint parses the given HTML text and returns the sentences and tokens (words or punctuation) in the text, including position information.
This could be relevant for users who are less interested in our content analysis, but would like to use our preprocessing.

### Potential application: WP-CLI command

Fetches a post (in WordPress) and prepares it for processing through Yoast SEO.
The output is a serialized `Paper` object (in JSON format) that can be used in the `analyze` routes above.