From 71b1123408cd253f96ddfdbf71635a81a01850f6 Mon Sep 17 00:00:00 2001
From: Joel Griffith
Date: Mon, 18 Mar 2019 10:30:27 -0700
Subject: [PATCH] Chore/update headless shell to pptr@1.13.0 (#33272)

---
 x-pack/build_chromium/README.md | 39 ++++++++++++++-----
 x-pack/build_chromium/build.py | 6 +--
 x-pack/build_chromium/windows/init.bat | 3 +-
 x-pack/package.json | 2 +-
 .../server/browsers/chromium/paths.js | 18 ++++-----
 yarn.lock | 28 +++++++++----
 6 files changed, 63 insertions(+), 33 deletions(-)

diff --git a/x-pack/build_chromium/README.md b/x-pack/build_chromium/README.md
index 742ccd9f45633..7e933b44c6cac 100644
--- a/x-pack/build_chromium/README.md
+++ b/x-pack/build_chromium/README.md
@@ -6,6 +6,15 @@ The official Chromium build process is poorly documented, and seems to have brea
 This document is an attempt to note all of the gotchas we've come across while building, so that the next time we have to tinker here, we'll have a good starting point.
 
+# Before you begin
+You'll need access to our GCP account, which is where we have two machines provisioned for the Linux and Windows builds. Mac builds can be done locally, and are a great place to start gaining familiarity.
+
+1. Log in to the GCP console [here, using your Okta credentials](https://console.cloud.google.com/).
+2. Click the "Compute Engine" tab.
+3. Ensure that `chromium-build-linux` and `chromium-build-windows-12-beefy` are there.
+4. If #3 fails, you'll have to spin up new instances. Generally, these need the `n1-standard-8` type (8 vCPUs / 30 GB memory).
+5. Ensure that there's enough room left on the disk. `ncdu` is a good Linux util for checking what's claiming space.
+
 ## Build args
 
 Chromium is built via a build tool called "ninja". The build can be configured by specifying build flags either in an "args.gn" file or via commandline args. We have an "args.gn" file per platform:
 
@@ -45,7 +54,7 @@ The more cores the better, as the build makes effective use of each. For Linux,
 
 ## Initializing each VM / environment
 
-You only need to initialize each environment once.
+You only need to initialize each environment once. NOTE: on macOS you'll need to install Xcode and accept the license agreement.
 
 Create the build folder:
 
@@ -55,7 +64,7 @@ Create the build folder:
 Copy the `x-pack/build-chromium` folder to each. Replace `you@your-machine` with the correct username and VM name:
 
 - Mac: `cp -r ~/dev/elastic/kibana/x-pack/build_chromium ~/chromium/build_chromium`
-- Linux: `gcloud compute scp --recurse ~/dev/elastic/kibana/x-pack/build_chromium you@your-machine:~/chromium --zone=us-east1-b`
+- Linux: `gcloud compute scp --recurse ~/dev/elastic/kibana/x-pack/build_chromium you@your-machine:~/chromium/build_chromium --zone=us-east1-b`
 - Windows: Copy the `build_chromium` folder via the RDP GUI into `c:\chromium\build_chromium`
 
 There is an init script for each platform. This downloads and installs the necessary prerequisites, sets environment variables, etc.
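A quick preflight before running an init script or kicking off a long build can save a wasted run: confirm the build folder exists and that the disk still has headroom (step 5 of "Before you begin" above). The snippet below is only a sketch and is not part of `build_chromium`; it assumes Python 3 (for `shutil.disk_usage`), the `~/chromium` build folder described above, and an arbitrary free-space threshold you should adjust for your own checkout.

```python
# preflight.py - optional sanity check before initializing or building
# (a sketch, not part of build_chromium).
import os
import shutil
import sys

BUILD_DIR = os.path.expanduser('~/chromium')  # the build folder created above
MIN_FREE_GB = 100                             # arbitrary threshold; adjust for your checkout

if not os.path.isdir(BUILD_DIR):
    sys.exit(BUILD_DIR + ' does not exist; create the build folder first')

free_gb = shutil.disk_usage(BUILD_DIR).free / (1024.0 ** 3)
print('Free disk space under %s: %.1f GB' % (BUILD_DIR, free_gb))
if free_gb < MIN_FREE_GB:
    sys.exit('Low on disk space; run `ncdu` to see what is claiming it')
```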
@@ -79,9 +88,9 @@ Find the Chromium revision (modify the following command to be wherever you have
 
 - `cat ~/dev/elastic/kibana/x-pack/node_modules/puppeteer-core/package.json | grep chromium_revision`
 - Take the revision number from that, and tack it to the end of this URL: https://crrev.com
-  - (For example: https://crrev.com/575458)
+  - (For example: https://crrev.com/637110)
 - Grab the SHA from there
-  - (For example, rev 575458 has sha 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479)
+  - (For example, rev 637110 has sha 2fac04abf6133ab2da2846a8fbd0e97690722699)
 
 Note: In Linux, you should run the build command in tmux so that if your ssh session disconnects, the build can keep going. To do this, just type `tmux` into your terminal to hop into a tmux session. If you get disconnected, you can hop back in like so:
 
@@ -91,15 +100,18 @@ Note: In Linux, you should run the build command in tmux so that if your ssh ses
 
 To run the build, replace the sha in the following commands with the sha that you wish to build:
 
-- Mac: `python ~/chromium/build_chromium/build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
-- Linux: `python ~/chromium/build_chromium/build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
-- Windows: `python c:\chromium\build_chromium\build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479`
+- Mac: `python ~/chromium/build_chromium/build.py 2fac04abf6133ab2da2846a8fbd0e97690722699`
+- Linux: `python ~/chromium/build_chromium/build.py 2fac04abf6133ab2da2846a8fbd0e97690722699`
+- Windows: `python c:\chromium\build_chromium\build.py 2fac04abf6133ab2da2846a8fbd0e97690722699`
 
 ## Artifacts
 
 After the build completes, there will be a .zip file and a .md5 file in `~/chromium/chromium/src/out/headless`. These are named like so: `chromium-{first_7_of_SHA}-{platform}`, for example: `chromium-4747cc2-linux`.
 
-The zip files need to be deployed to s3. For testing, I drop them into `headless-shell-dev`, but for production, they need to be in `headless-shell`. And the `x-pack/plugins/reporting/server/browsers/chromium/paths.js` file needs to be upated to have the correct `archiveChecksum`, `archiveFilename` and `baseUrl`.
+The zip files need to be deployed to s3. For testing, I drop them into `headless-shell-dev`, but for production, they need to be in `headless-shell`. And the `x-pack/plugins/reporting/server/browsers/chromium/paths.js` file needs to be updated with the correct `archiveChecksum`, `archiveFilename`, `rawChecksum`, and `baseUrl`. Here is what each checksum is:
+
+- `archiveChecksum`: The contents of the `.md5` file, which is the `md5` checksum of the zip file.
+- `rawChecksum`: The `md5` checksum of the `headless_shell` binary itself.
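The `.md5` file already contains `archiveChecksum`, and `rawChecksum` can be computed from the extracted binary. The snippet below is a convenience sketch rather than part of the build scripts; it only uses the standard `hashlib` module, and the file names in the usage comment are just this build's examples.

```python
# checksums.py - print the archiveChecksum / rawChecksum values for paths.js
# (a sketch, not part of the repo).
# Usage: python checksums.py chromium-2fac04a-linux.zip headless_shell-linux/headless_shell
import hashlib
import sys

def md5_of(path):
    md5 = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b''):
            md5.update(chunk)
    return md5.hexdigest()

print('archiveChecksum: ' + md5_of(sys.argv[1]))  # md5 of the zip; should match the .md5 file
print('rawChecksum: ' + md5_of(sys.argv[2]))      # md5 of the headless_shell binary itself
```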
 *If you're building in the cloud, don't forget to turn off your VM after retrieving the build artifacts!*
 
@@ -109,7 +121,16 @@ After getting the build to pass, the resulting binaries often failed to run or w
 
 You can run the headless browser manually to see what errors it is generating (replace the `c:\dev\data` with the path to a dummy folder you've created on your system):
 
-`headless_shell.exe --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --user-data-dir=c:\dev\data --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=3333 about:blank`
+**Mac**
+`headless_shell --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
+
+**Linux**
+`headless_shell --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
+
+**Windows**
+`headless_shell.exe --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
+
+In the case of Windows, you can use IE to open `http://localhost:9221` and see if the page loads. On Mac/Linux you can just curl the JSON endpoints: `curl http://localhost:9221/json/list` (a scripted version of this check is sketched at the end of this patch).
 
 ## Resources
 
diff --git a/x-pack/build_chromium/build.py b/x-pack/build_chromium/build.py
index e881c3691bd6d..190a8dfc92d4a 100644
--- a/x-pack/build_chromium/build.py
+++ b/x-pack/build_chromium/build.py
@@ -77,13 +77,9 @@ def archive_file(name):
   archive_file('headless_shell.exe')
   archive_file('dbghelp.dll')
   archive_file('icudtl.dat')
-  archive_file('natives_blob.bin')
-  archive_file('snapshot_blob.bin')
 elif platform.system() == 'Darwin':
   archive_file('headless_shell')
-  archive_file('natives_blob.bin')
-  archive_file('snapshot_blob.bin')
-  archive_file('Helpers/crashpad_handler')
+  archive_file('Helpers/chrome_crashpad_handler')
 
 archive.close()
 
diff --git a/x-pack/build_chromium/windows/init.bat b/x-pack/build_chromium/windows/init.bat
index 0775fe113773a..3bd82fb0adec8 100644
--- a/x-pack/build_chromium/windows/init.bat
+++ b/x-pack/build_chromium/windows/init.bat
@@ -1,5 +1,6 @@
 : This only needs to be run once per environment to set it up.
 : This requires a GUI, as the VS installation is graphical.
+: If initialization fails, you can simply run the `install_vs.exe` installer manually.
 
 @echo off
 
@@ -19,7 +20,7 @@ powershell -command "& {iwr -outf %~dp0../../depot_tools.zip https://storage.goo
 powershell -command "& {Expand-Archive %~dp0../../depot_tools.zip -DestinationPath %~dp0../../depot_tools}"
 
 : Set the environment variables required by depot_tools
-@echo "When Visual Studio is installed, you need to enable the Windows SDK in Control Panel.
After taht, press here to continue initialization" +@echo "When Visual Studio is installed, you need to enable the Windows SDK in Control Panel. After that, press here to continue initialization" pause diff --git a/x-pack/package.json b/x-pack/package.json index 47220b20cfc99..6bb06ede5a17f 100644 --- a/x-pack/package.json +++ b/x-pack/package.json @@ -237,7 +237,7 @@ "polished": "^1.9.2", "prop-types": "^15.6.0", "puid": "1.0.5", - "puppeteer-core": "^1.7.0", + "puppeteer-core": "^1.13.0", "raw-loader": "0.5.1", "react": "^16.8.0", "react-apollo": "^2.1.4", diff --git a/x-pack/plugins/reporting/server/browsers/chromium/paths.js b/x-pack/plugins/reporting/server/browsers/chromium/paths.js index cefad5d032c51..02f6e7b414697 100644 --- a/x-pack/plugins/reporting/server/browsers/chromium/paths.js +++ b/x-pack/plugins/reporting/server/browsers/chromium/paths.js @@ -11,21 +11,21 @@ export const paths = { baseUrl: 'https://s3.amazonaws.com/headless-shell/', packages: [{ platforms: ['darwin', 'freebsd', 'openbsd'], - archiveFilename: 'chromium-04c5a83-darwin.zip', - archiveChecksum: '89a98bfa6454bec550f196232d1faeb3', - rawChecksum: '413bbd646a4862a136bc0852ab6f41c5', + archiveFilename: 'chromium-2fac04a-darwin.zip', + archiveChecksum: '36814b1629457aa178b4ecdf6cc1bc5f', + rawChecksum: '9b40e2efa7f4f1870835ee4cdaf1dd51', binaryRelativePath: 'headless_shell-darwin/headless_shell', }, { platforms: ['linux'], - archiveFilename: 'chromium-04c5a83-linux.zip', - archiveChecksum: '1339f6d57b6039445647dcdc949ba513', - rawChecksum: '4824710dd8f3da9d9e2c0674a771008b', + archiveFilename: 'chromium-2fac04a-linux.zip', + archiveChecksum: '5cd6b898a35f9dc0ba6f49d821b8a2a3', + rawChecksum: 'b3fd218d3c3446c388da4e6c8a82754c', binaryRelativePath: 'headless_shell-linux/headless_shell' }, { platforms: ['win32'], - archiveFilename: 'chromium-04c5a83-windows.zip', - archiveChecksum: '3b3279b59ebf03db676baeb7b7ab5c24', - rawChecksum: '724011f9acf872c9472c82c6f7981178', + archiveFilename: 'chromium-2fac04a-windows.zip', + archiveChecksum: '1499a4d5847792d59b9c1a8ab7dc8b94', + rawChecksum: '08b48d2f3d23c4bc8b58779ca4a7b627', binaryRelativePath: 'headless_shell-windows\\headless_shell.exe' }] }; diff --git a/yarn.lock b/yarn.lock index 4f238c9ecc5a7..3649d780ffdb9 100644 --- a/yarn.lock +++ b/yarn.lock @@ -18349,6 +18349,11 @@ progress@^2.0.0: resolved "https://registry.yarnpkg.com/progress/-/progress-2.0.0.tgz#8a1be366bf8fc23db2bd23f10c6fe920b4389d1f" integrity sha1-ihvjZr+Pwj2yvSPxDG/pILQ4nR8= +progress@^2.0.1: + version "2.0.3" + resolved "https://registry.yarnpkg.com/progress/-/progress-2.0.3.tgz#7e8cf8d8f5b8f239c1bc68beb4eb78567d572ef8" + integrity sha512-7PiHtLll5LdnKIMw100I+8xJXR5gW2QwWYkT6iJva0bXitZKa/XMrSbdmg3r2Xnaidz9Qumd0VPaMrZlF9V9sA== + promise-inflight@^1.0.1: version "1.0.1" resolved "https://registry.yarnpkg.com/promise-inflight/-/promise-inflight-1.0.1.tgz#98472870bf228132fcbdd868129bad12c3c029e3" @@ -18659,19 +18664,19 @@ punycode@^1.2.4, punycode@^1.4.1: resolved "https://registry.yarnpkg.com/punycode/-/punycode-1.4.1.tgz#c0d5a63b2718800ad8e1eb0fa5269c84dd41845e" integrity sha1-wNWmOycYgArY4esPpSachN1BhF4= -puppeteer-core@^1.7.0: - version "1.7.0" - resolved "https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-1.7.0.tgz#c10f660983e9a4faacf6b8e50861c7739871c752" - integrity sha512-SpUOJL8gTPEuABGcZxKM3jg5s0rIwmRC6P9Jw/JTG3XFCVtUcYQru4Uwlz7jAXe6JEeeLOm6hApgGCmRyALiig== +puppeteer-core@^1.13.0: + version "1.13.0" + resolved 
"https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-1.13.0.tgz#f8001851e924e6e9ef6e9fae1778c3ab87c3f307" + integrity sha512-8MypjWVHu2EEdtN2HxhCsTtIYdJgiCcbGpHoosv265fzanfOICC2/DadLZq6/Qc/OKsovQmjkO+2vKMrV3BRfA== dependencies: - debug "^3.1.0" + debug "^4.1.0" extract-zip "^1.6.6" https-proxy-agent "^2.2.1" mime "^2.0.3" - progress "^2.0.0" + progress "^2.0.1" proxy-from-env "^1.0.0" rimraf "^2.6.1" - ws "^5.1.1" + ws "^6.1.0" q@^1.0.1, q@^1.1.2: version "1.5.1" @@ -25110,7 +25115,7 @@ write@^0.2.1: dependencies: mkdirp "^0.5.1" -ws@^5.1.1, ws@^5.2.0: +ws@^5.2.0: version "5.2.2" resolved "https://registry.yarnpkg.com/ws/-/ws-5.2.2.tgz#dffef14866b8e8dc9133582514d1befaf96e980f" integrity sha512-jaHFD6PFv6UgoIVda6qZllptQsMlDEJkTQcybzzXDYM1XO9Y8em691FGMPmM46WGyLU4z9KMgQN+qrux/nhlHA== @@ -25124,6 +25129,13 @@ ws@^6.0.0: dependencies: async-limiter "~1.0.0" +ws@^6.1.0: + version "6.2.0" + resolved "https://registry.yarnpkg.com/ws/-/ws-6.2.0.tgz#13806d9913b2a5f3cbb9ba47b563c002cbc7c526" + integrity sha512-deZYUNlt2O4buFCa3t5bKLf8A7FPP/TVjwOeVNpw818Ma5nk4MLXls2eoEGS39o8119QIYxTrTDoPQ5B/gTD6w== + dependencies: + async-limiter "~1.0.0" + ws@~3.3.1: version "3.3.3" resolved "https://registry.yarnpkg.com/ws/-/ws-3.3.3.tgz#f1cf84fe2d5e901ebce94efaece785f187a228f2"