feat: replace nuget and embedded windows dependencies with direct http downloads #1502
Conversation
This uses cmake ExternalProject_Add to download at build time, using a copy of libraries mirrored in a github repository to ensure availability.
If I may suggest: use git submodules instead of ExternalProject_Add. The reason is that every time you want to do a clean build of CG, it would need to re-download external projects. Instead, you could clone external projects into the CasparCG organization and add them as submodules. This way you can check out CG with submodules (using the appropriate clone flag).
It will only download them once, unless you also delete its cache directory. It puts it inside the build directory by default, which does risk it being deleted more often, but it is exposed as an option.

I am hesitant to use submodules for a few reasons:
What I really like about this approach is that it is 99% using official binaries of the libraries (just hosted elsewhere), and by changing two lines in the cmake file you can load either a different official or a custom version of a library. And it does have caching, avoiding much of the cost of compiling locally or in CI. (I almost didn't precompile boost, but thought the couple of minutes of build cost was a bit much; making that opt-out was easy enough.)

This will also work with one change if you are running the build on a CI server without internet access. Simply host these downloads on your own server that CI can access, and point the mirror option at it.
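Assuming the option names from the PR description, pointing an air-gapped CI runner at an internal mirror and a pre-populated cache might look like this (the URL and paths are placeholders, not real infrastructure):

```shell
cmake .. \
  -DCASPARCG_DOWNLOAD_MIRROR=https://ci-mirror.example.internal/casparcg-deps \
  -DCASPARCG_DOWNLOAD_CACHE=/var/cache/casparcg-deps
```

With the mirror option set, cmake fetches the same archives from the internal host; with the cache option set, it skips downloading entirely when the files are already present.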
Being able to specify a cache directory is definitely helpful. Although, it's still a bit of a PITA since you need to somehow download the dependencies into the cache. 😝 And yeah, since you are downloading binaries, submodules are not necessarily the right tool for this.

At one point in the past I did have a script that built the server from sources and system libraries only, but that was a very long time ago. It would be nice to still have that as an option. But otherwise, in terms of submodules making git operations slow, that's not the case. (Not that it matters -- just want to set the record straight. 😸)
cmake will populate that for you; if you don't have internet access, then yes, you will need to do that yourself.
Yeah, for Linux it will still be possible to build almost everything against system libraries. The only ones we handle differently are ffmpeg and boost, and you can opt into the system versions of those. And CEF, but no distro packages that (as far as I'm aware).

Doing this has made me want to try to figure out deb packaging again. I've still got tabs open about that from months ago, and I think I did figure out how to do it for the client.
Sorry, that's basically what I meant. I would need to run cmake (and have a full dev environment) on a public facing box in order to get the files cached. Whereas with git you could just clone recursively without needing cmake, gcc, etc.
Oh that's good to know.
My goal is to create a PPA and make Debian packages for cef, casparcg, etc. and make them available for different distros. I already host a number of PPAs here: https://launchpad.net/~ppa-verse Some day I will hopefully get to it. Unfortunately, other projects at work keep getting in the way, and doing it on my own time would require an incentive... 😄
Ah right. Yeah, that is the 'simplest' option for populating the cache. It can be done manually, if you know which assets are needed. I should add a section about this to some building guide.
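Populating the cache manually could look something like the sketch below. The release tag and asset names are hypothetical; the real list of required archives lives in the project's cmake files.

```shell
#!/bin/sh
# Sketch: pre-populate the download cache on a machine without a full
# dev environment. Asset names and the release tag are placeholders.
MIRROR="https://github.com/CasparCG/dependencies/releases/download"
CACHE="./download-cache"
mkdir -p "$CACHE"
for asset in example-tag/boost.zip example-tag/ffmpeg.zip; do
  # The real fetch would be:
  # curl -L -o "$CACHE/$(basename "$asset")" "$MIRROR/$asset"
  echo "would fetch $MIRROR/$asset into $CACHE/"
done
```

The commented-out curl line is where the actual download would happen; the echo just shows which URLs would be requested, so the script can be dry-run safely.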
@dimitry-ishenko I have been thinking more about a PPA recently while looking at the poor state of the old casparcg build in debian contrib (spoiler: it crashes at launch and has for years). Some more thought probably needs to be put into versioning.
If you are going to bump minor version on each CEF upgrade, then I don't think so (see below).
Yes. They should be …, and then a meta-package called …. This way, anyone who wants to stick to a specific version and only receive minor updates/bug fixes can install either version. And, if someone just wants the latest stable version, they can install …. Or, if someone wants to install both versions side-by-side, they can do that too.
Sure, for people who like to live on the edge or want to test new features. 😄 Unless they are stable enough,
Current LTS and prior LTS, plus current active version. So, right now it would be 20.04 LTS, 22.04 LTS and 23.04. Once 24.04 LTS comes out, drop 22.04 LTS and 23.04. Once 24.10 comes out, add it to the list. Once 25.04 comes out, drop 24.10 and add 25.04, etc.
Yeah, that is what I was thinking. I think it is worth the effort to do
I think this is a good argument to have the CEF version in the name. I think I've convinced myself that it is the right way to go.
Yeah, that is what I was thinking. I might save the effort for now and skip 20.04 (I don't see why anyone would deploy today from this PPA onto 20.04 instead of 22.04), but once 24.04 is released, both LTS versions make sense.

Of course, now I need to figure out how to structure all these packages in git to maintain them.
This uses cmake ExternalProject_Add to download at build time, using a copy of libraries mirrored in a github repository to ensure availability.

This does not impact the ability to do builds offline: the CASPARCG_DOWNLOAD_CACHE option can be used to specify a folder where all the dependencies can be found. Alternatively, the downloads can be redirected to a mirror using the CASPARCG_DOWNLOAD_MIRROR option.

The primary source/'mirror' of these assets can be found at https://github.com/CasparCG/dependencies/releases. They are hosted here so that we don't have to worry about them possibly disappearing or moving unexpectedly, as technically we need to be able to provide them on demand to conform to the GPL license.
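A minimal sketch of the pattern described above, assuming the two cache options; the dependency name, URL layout, and hash are illustrative, not the actual CMakeLists contents:

```cmake
# Sketch only: how a prebuilt dependency could be wired up with
# ExternalProject_Add, honoring a local cache and a mirror override.
include(ExternalProject)

set(CASPARCG_DOWNLOAD_MIRROR
    "https://github.com/CasparCG/dependencies/releases/download"
    CACHE STRING "Base URL to fetch prebuilt dependencies from")
set(CASPARCG_DOWNLOAD_CACHE ""
    CACHE PATH "Local folder containing pre-downloaded dependency archives")

if(CASPARCG_DOWNLOAD_CACHE)
  # Offline build: use the archive already present in the cache folder.
  set(BOOST_URL "${CASPARCG_DOWNLOAD_CACHE}/boost.zip")
else()
  # Online build: fetch from the (possibly overridden) mirror.
  set(BOOST_URL "${CASPARCG_DOWNLOAD_MIRROR}/example-tag/boost.zip")
endif()

ExternalProject_Add(boost
  URL           "${BOOST_URL}"
  URL_HASH      SHA256=0000000000000000000000000000000000000000000000000000000000000000 # placeholder pin
  CONFIGURE_COMMAND ""  # prebuilt binaries: nothing to configure,
  BUILD_COMMAND     ""  # build, or install; just extract the archive.
  INSTALL_COMMAND   ""
)
```

Pinning the archive hash means any mirror must serve byte-identical files, which keeps a redirected CASPARCG_DOWNLOAD_MIRROR honest.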
I was previously looking into vcpkg, but it wants to build everything from source all the time. We would have needed caching because building ffmpeg was taking hours and causing the builds to time out. vcpkg does support caching, but using it was proving to be painful.
I think the same treatment should be applied to the Linux builds, to replace the 'caches' stored in docker images.