
feat: replace nuget and embedded windows dependencies with direct http downloads #1502

Merged
merged 3 commits into master from feat/replace-nuget-with-cmake
Nov 6, 2023

Conversation

Julusian
Member

@Julusian Julusian commented Oct 31, 2023

This uses CMake's ExternalProject_Add to download at build time, using a copy of the libraries mirrored in a GitHub repository to ensure availability.
This does not impact the ability to do builds offline: the CASPARCG_DOWNLOAD_CACHE option can be used to specify a folder where all the dependencies can be found. Alternatively, the downloads can be redirected to a mirror using the CASPARCG_DOWNLOAD_MIRROR option.

The primary source/'mirror' of these assets can be found at https://github.com/CasparCG/dependencies/releases. They are hosted there so that we don't have to worry about them possibly disappearing or moving unexpectedly, as technically we need to be able to provide them on demand to conform to the GPL license.
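
Roughly, each Windows dependency declaration has this shape (the archive name, path and hash below are placeholders, not the real values, which live in the cmake files):

```cmake
include(ExternalProject)

# Where to fetch prebuilt archives from; defaults to the CasparCG/dependencies releases mentioned above.
set(CASPARCG_DOWNLOAD_MIRROR "https://github.com/CasparCG/dependencies/releases/download"
    CACHE STRING "Base URL that dependency archives are downloaded from")

# Placeholder example: a prebuilt FreeImage archive for Windows.
ExternalProject_Add(freeimage
    URL      "${CASPARCG_DOWNLOAD_MIRROR}/freeimage/FreeImage-3.18.0-win64.zip"
    URL_HASH SHA256=0000000000000000000000000000000000000000000000000000000000000000
    CONFIGURE_COMMAND ""  # prebuilt binaries: nothing to configure,
    BUILD_COMMAND     ""  # nothing to build,
    INSTALL_COMMAND   ""  # and nothing to install; headers/libs are used straight from the extracted folder
)
```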

I was previously looking into vcpkg, but it wants to build everything from source all the time. We would have needed caching because building ffmpeg was taking hours and caused the builds to time out. vcpkg does support caching, but using it was proving to be painful.

I think the same treatment should be done for the Linux builds, to replace the 'caches' stored in Docker images.

@dimitry-ishenko
Contributor

If I may suggest: use git submodules instead of ExternalProject_Add. The reason is that every time you want to do a clean build of CG, it would need to re-download the external projects.

Instead you could clone external projects in the CasparCG organization and add them as submodules. This way you can check out CG with submodules (using the --recursive option) once and work on it locally without having to re-download external projects multiple times.

@Julusian
Member Author

Julusian commented Oct 31, 2023

It will only download them once, unless you also delete its cache directory. That cache sits inside the build directory by default, which does risk it being deleted more often, but its location is exposed as the CASPARCG_DOWNLOAD_CACHE option.
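
As a minimal sketch of the caching idea, assuming the cache option maps onto ExternalProject's DOWNLOAD_DIR (archive name and hash are placeholders):

```cmake
# If no cache directory is given, keep downloads inside the build tree
# (which means they are lost whenever the build directory is deleted).
set(CASPARCG_DOWNLOAD_CACHE "" CACHE PATH "Folder to store/reuse downloaded dependency archives")
if(NOT CASPARCG_DOWNLOAD_CACHE)
    set(CASPARCG_DOWNLOAD_CACHE "${CMAKE_BINARY_DIR}/external/downloads")
endif()

ExternalProject_Add(boost
    URL          "${CASPARCG_DOWNLOAD_MIRROR}/boost/boost_1_67_0-win64.zip"
    URL_HASH     SHA256=0000000000000000000000000000000000000000000000000000000000000000
    DOWNLOAD_DIR "${CASPARCG_DOWNLOAD_CACHE}"  # an archive already present here with a matching hash is reused, not re-downloaded
    CONFIGURE_COMMAND "" BUILD_COMMAND "" INSTALL_COMMAND ""
)
```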

I am hesitant to use submodules for a few reasons:

  1. Linux and Windows need different sets of these dependencies; being forced to clone 8 submodules on Linux when you only need one of them is slow and annoying.
  2. I don't want to push us back to where we were with 2.1, where everything was committed to the repository, making all git operations slow due to its size. I fear the same could happen from having many submodules.
  3. We would either need to commit compiled binaries, or a few of these dependencies (e.g. CEF) would still need to be done as HTTP downloads. The CEF .so is too large to commit to a repository, and is not something we can compile ourselves. ffmpeg also takes GitHub Actions a couple of hours to build.
  4. FreeImage, zlib, and maybe more aren't on GitHub, so we would have to maintain our own repositories of them. That would be easier than maintaining nuget packages, but it is still extra effort.

What I really like about this approach is that it is 99% using official binaries of the libraries (just hosted elsewhere), and by changing two lines in the cmake file you can load either a different official version or a custom build of a library. And it does have caching, avoiding much of the cost of compiling locally or in CI. (I almost didn't precompile boost, but thought the couple of minutes of build cost was a bit much; making that opt-out was easy enough.)
Maybe these URLs should be made more configurable in the future? I am open to that, but not enough to do it myself.

This will also work with one change if you are running the build on a CI server without internet access: simply host these downloads on your own server that CI can access, and set CASPARCG_DOWNLOAD_MIRROR to point at it.
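
For example, with a hypothetical initial-cache file (the hostname is just an example of an internal server):

```cmake
# offline-ci.cmake -- passed to the configure step as: cmake -C offline-ci.cmake <source-dir>
# (equivalently, pass -DCASPARCG_DOWNLOAD_MIRROR=... on the command line)
set(CASPARCG_DOWNLOAD_MIRROR "https://artifacts.internal.example/casparcg-deps"
    CACHE STRING "Internal mirror serving the same archives as the CasparCG/dependencies releases")
```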

@dimitry-ishenko
Contributor

dimitry-ishenko commented Oct 31, 2023

Being able to specify a cache directory is definitely helpful. Although, it's still a bit of a PITA since you need to somehow download the dependencies into the cache. 😝

And yeah, since you are downloading binaries, submodules are not necessarily the right tool for this. At one point in the past I did have a script that built the server from sources and system libraries only, but that was a very long time ago. It would be nice to still have that as an option.

But otherwise, in terms of submodules making git operations slow, that's not the case. (Not that it matters -- just want to set the record straight. 😸)

@Julusian
Member Author

Julusian commented Oct 31, 2023

Being able to specify a cache directory is definitely helpful. Although, it's still a bit of a PITA since you need to somehow download the dependencies into the cache.

cmake will populate that for you. If you don't have internet access, then yes, you will need to do that yourself.

And yeah, since you are downloading binaries, submodules are not necessarily the right tool for this. At one point in the past I did have a script that built the server from sources and system libraries only, but that was a very long time ago. It would be nice to still have that as an option.

Yeah, for Linux it will still be possible to build almost everything against system libraries. The only ones we handle other ways are ffmpeg and boost, but you can opt into the system versions. And CEF, but no distro packages that (as far as I'm aware).
I can see the appeal of compiling everything from source, but 99% of people don't care about that and it makes the build time ridiculous. So I am happy to do more to support it, but using binaries for libraries should be the focus. What I did for boost here is one way it could be done, perhaps with a more unified flag?
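
Something along these lines, using a hypothetical USE_SYSTEM_FFMPEG flag (the actual option names in the cmake files may differ):

```cmake
option(USE_SYSTEM_FFMPEG "Link against the distro's FFmpeg instead of the prebuilt download" OFF)

if(USE_SYSTEM_FFMPEG)
    # Resolve FFmpeg from the system via pkg-config.
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(FFMPEG REQUIRED IMPORTED_TARGET
        libavcodec libavformat libavutil libswscale libswresample libavfilter)
    # Targets then link against PkgConfig::FFMPEG.
else()
    # Otherwise fall back to the prebuilt archive fetched via ExternalProject_Add.
endif()
```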


Doing this has made me want to try and figure out deb packaging again... I've still got tabs open about that from months ago, and I think I did figure out how to do it for the client.

@dimitry-ishenko
Contributor

cmake will populate that for you. If you don't have internet access, then yes, you will need to do that yourself.

Sorry, that's basically what I meant. I would need to run cmake (and have a full dev environment) on a public facing box in order to get the files cached. Whereas with git you could just clone recursively without needing cmake, gcc, etc.

Yeah, for Linux it will still be possible to build almost everything against system libraries. The only ones we handle other ways are ffmpeg and boost, but you can opt into the system versions.

Oh that's good to know.

And CEF, but no distro packages that (as far as I'm aware).
Doing this has made me want to try and figure out deb packaging again.

My goal is to create a PPA and make Debian packages for cef, casparcg, etc. and make them available for different distros. I already host a number of PPAs here: https://launchpad.net/~ppa-verse

Some day I will hopefully get to it. Unfortunately, other projects at work keep getting in the way, and doing it on my own time would require an incentive... 😄

@Julusian
Member Author

Julusian commented Nov 1, 2023

Sorry, that's basically what I meant. I would need to run cmake (and have a full dev environment) on a public facing box in order to get the files cached. Whereas with git you could just clone recursively without needing cmake, gcc, etc.

Ah right. Yeah, that is the 'simplest' option for populating the cache. It can be done manually, if you know which assets are needed. I should add a section about this to some building guide...
It won't be entirely straightforward, as it will require downloading the correct versions, which I don't want to document outside of the cmake files, as I know I will forget to update whatever document that is.
But either way, this is a lot easier to do than nuget, where I have no idea where to even begin trying to populate its cache offline.
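
As a sketch of the manual route, a small hypothetical prefetch script could populate the cache (the archive names below are illustrative; the cmake files remain the authoritative list of versions):

```cmake
# prefetch.cmake -- run on a machine with internet access as:
#   cmake -DCACHE_DIR=/path/to/cache -P prefetch.cmake
set(MIRROR "https://github.com/CasparCG/dependencies/releases/download")
set(ARCHIVES
    "ffmpeg/ffmpeg-6.0-win64.zip"      # illustrative file names; check the
    "cef/cef_binary_117-win64.zip"     # cmake files for the versions actually needed
)

foreach(archive IN LISTS ARCHIVES)
    get_filename_component(name "${archive}" NAME)
    message(STATUS "Fetching ${name}")
    file(DOWNLOAD "${MIRROR}/${archive}" "${CACHE_DIR}/${name}" SHOW_PROGRESS STATUS result)
    list(GET result 0 code)
    if(NOT code EQUAL 0)
        message(FATAL_ERROR "Failed to download ${archive}: ${result}")
    endif()
endforeach()
```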

This was referenced Nov 1, 2023
@Julusian Julusian merged commit 4d008dc into master Nov 6, 2023
@Julusian Julusian mentioned this pull request Nov 6, 2023
@Julusian
Member Author

Julusian commented Dec 12, 2023

@dimitry-ishenko I have been thinking more about a PPA recently while looking at the poor state of the old casparcg build in Debian contrib (spoiler: it crashes at launch, and has for years).
Based upon the packaging they use, I have managed to publish a version to a PPA: https://launchpad.net/~casparcg/+archive/ubuntu/ppa
CEF support is not yet finished, but I am close.

Some more thought probably needs to be put into versioning

  • should CEF have the major version in the name and path to avoid conflicts?
  • should each major/minor release (e.g. 2.3 vs 2.4) have its own package name?
  • should the master builds also be published (as casparcg-server-beta or something)?
  • which versions of Ubuntu is it worth packaging for?

@Julusian Julusian deleted the feat/replace-nuget-with-cmake branch December 12, 2023 18:19
@dimitry-ishenko
Contributor

dimitry-ishenko commented Dec 13, 2023

Some more thought probably needs to be put into versioning

  • should CEF have the major version in the name and path to avoid conflicts?

If you are going to bump minor version on each CEF upgrade, then I don't think so (see below).

  • should each major/minor release (e.g. 2.3 vs 2.4) have its own package name?

Yes. They should be casparcg-server-2.3 and casparcg-server-2.4

And then a meta-package called casparcg-server, which depends on the latest stable version.

This way, someone who wants to stick to a specific version and only receive minor updates/bug fixes can install either one. And if someone just wants the latest stable version, they can install casparcg-server.

Or, if someone wants to install both versions side-by-side, they can do that too.

  • should the master builds also be published (as casparcg-server-beta or something)?

Sure, for people who like to live on the edge or want to test new features. 😄 Unless they are stable enough, though, casparcg-server-unstable would be a more suitable name than -beta, IMHO.

  • which versions of Ubuntu is it worth packaging for?

Current LTS and prior LTS, plus current active version. So, right now it would be 20.04 LTS, 22.04 LTS and 23.04.

Once 24.04 LTS comes out, drop 20.04 LTS and 23.04. Once 24.10 comes out, add it to the list. Once 25.04 comes out, drop 24.10 and add 25.04, etc.

@Julusian
Member Author

Julusian commented Dec 13, 2023

Yes. They should be casparcg-server-2.3 and casparcg-server-2.4
And then a meta-package called casparcg-server, which depends on the latest stable version.

Yeah, that is what I was thinking. I think it is worth the effort to do.

If you are going to bump minor version on each CEF upgrade, then I don't think so (see below).
Or, if someone wants to install both versions side-by-side, they can do that too.

I think this is a good argument to have the CEF version in the name. I think I've convinced myself that it is the right way to go.
As far as I'm aware, CEF doesn't provide a stable ABI, and part of it is statically linked inside CasparCG. So until proven otherwise, I am going with the strategy that casparcg-server depends on an exact CEF version.
Which means that if you want to install casparcg-server-2.3 and casparcg-server-2.4, but one requires CEF 117 and the other CEF 116, that can't be satisfied unless those are two different CEF packages.

Current LTS and prior LTS, plus current active version. So, right now it would be 20.04 LTS, 22.04 LTS and 23.04.

Yeah, that is what I was thinking. I might save the effort for now and skip 20.04 (I don't see why anyone would deploy today from this PPA onto 20.04 instead of 22.04), but once 24.04 is released, packaging for both LTS versions makes sense.

Of course, now I need to figure out how to structure all these packages in git to maintain them.
I currently have everything in a dedicated PPA git repository, but perhaps it should all be spread out into the source repositories?
