Update PsychoPy recipe, and pull in upstream changes. (#1)
* add pip

* fix

* some gl fixes

* fix

* added trio-asyncio [skip appveyor]

* vendor -> vend

* fix url

* Update recipes/pylbm/meta.yaml

Co-Authored-By: Chris Burr <[email protected]>

* Update recipes/pylbm/meta.yaml

Co-Authored-By: Chris Burr <[email protected]>

* python-backtrace

* rm setuptools dep

* add license

* add LICENSE.txt to recipe folder

* better deps

* Removed recipes (atlite, python-backtrace) after converting into feedstocks. [ci skip]

* Removed recipe (pylbm) after converting into feedstock. [ci skip]

* Removed recipe (libhomfly) after converting into feedstock. [ci skip]

* fix stuff

* fixed meta

* Add libbraiding

* xontrib-readable-traceback

* add license

* Removed recipe (trio-asyncio) after converting into feedstock. [ci skip]

* Add C++ compiler

* trio-aiohttp

* Removed recipe (libbraiding) after converting into feedstock. [ci skip]

* fix test

* more fixes

* more fixes

* libxkbcommon

* added build

* forgot sha256

* another couple of things

* don't run pytest

The tests are not in the dist. They do pass in GitLab CI for this release:

https://gitlab.com/deltares/imod/imod-python/pipelines/57198406
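As a quick sanity check (a hedged sketch; the tarball filename below is a placeholder, not taken from the recipe), one can confirm that the published sdist ships no test modules before deciding to skip pytest:

# Hedged sketch: inspect the downloaded sdist and list any shipped test modules.
# The filename/version is illustrative only.
import tarfile

with tarfile.open("imod-0.6.0.tar.gz") as sdist:
    test_files = [m.name for m in sdist.getmembers()
                  if "/tests/" in m.name or "/test/" in m.name]

print(test_files or "no test files in the sdist, so the recipe cannot run pytest")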

* python req

* more deps

* more build

* add pkg-config

* cflags

* -lxcb

* xontrib-ssh-agent

* Added recipe for pycomlink based on example

* Initial work setting up GPI framework

* Removed recipe (xontrib-ssh-agent) after converting into feedstock. [ci skip]

* Fixed wrong sha256 hash of source tarball

Strangely, the hash of the source tarball downloaded via "curl -O URL" differs
(and is wrong in this case) from the hash of the file that is downloaded
via the browser. Or maybe I did something wrong...
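One way to cross-check the value that goes into meta.yaml is to recompute it directly. A minimal sketch follows; the URL and expected digest are placeholders. Note that urllib follows redirects, whereas a bare "curl -O" does not, and hashing a saved redirect page is one possible (unconfirmed) explanation for the mismatch:

# Hedged sketch for verifying the sha256 written into meta.yaml.
# URL and expected digest are placeholders, not the real recipe values.
import hashlib
import urllib.request

url = "https://pypi.io/packages/source/e/example-pkg/example-pkg-1.0.tar.gz"
expected = "0" * 64  # the sha256 value from meta.yaml

with urllib.request.urlopen(url) as resp:  # follows redirects, unlike bare curl -O
    digest = hashlib.sha256(resp.read()).hexdigest()

print(digest)
assert digest == expected, "sha256 in meta.yaml does not match the downloaded tarball"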

* rm packages

* typo in license

Co-Authored-By: jakirkham <[email protected]>

* removed pip install flags

* Removed recipes (pycomlink, trio-aiohttp) after converting into feedstocks. [ci skip]

* Regenerate with upstreamed RPM skeleton fixes

* add yum_requirements.txt

* Removed recipe (libxkbcommon) after converting into feedstock. [ci skip]

* quart-trio

* added libxkbcommon

* xorg-libxinerama

* Removed recipe (imutils) after converting into feedstock. [ci skip]

* Removed recipe (quart-trio) after converting into feedstock. [ci skip]

* some minor fixes

* harfbuzz isn't built correctly to support mac here

* Add xrviz

* correct deps

* Removed recipe (imod) after converting into feedstock. [ci skip]

* Removed recipe (deepxde) after converting into feedstock. [ci skip]

* Removed recipe (kitty) after converting into feedstock. [ci skip]

* try again

* use release

* xkeyboard-config

* google-pasta

* remove extra about/home entry

* Removed recipe (google-pasta) after converting into feedstock. [ci skip]

* add rio-cogeo and supermercado

* don't build py<3.3

* remove rasterio from build

* Removed recipes (rio-cogeo, supermercado) after converting into feedstocks. [ci skip]

* add nbgitpuller

* Removed recipe (nbgitpuller) after converting into feedstock. [ci skip]

* Revert "Initial work setting up GPI framework"

This reverts commit d095a7d.

* Add several aio-libs projects

* aiohttp_jinja2 -> aiohttp-jinja2

* more underscore fixes

* Removed recipes (aiohttp-debugtoolbar, aiohttp-jinja2, aiohttp-security, aiohttp-session, aiomcache, aioredis, janus) after converting into feedstocks. [ci skip]

* Adding recipe for urbansim_defaults

* Add switch-model recipe

* Update requirements

* Fix test/source_files section

* Removed recipe (urbansim_defaults) after converting into feedstock. [ci skip]

* Recipe for coq-jupyter

* Add newline at end of meta.yaml

* Adding recipe for PyCRC

* Fix imports

* Removed recipe (pycrc) after converting into feedstock. [ci skip]

* Create pypd meta.yaml file

* Add pypd LICENSE

* Removed recipe (pypd) after converting into feedstock. [ci skip]

* Added GPI build recipe

* Removed python dependency in run

* Reordered sections in meta.yaml

* Changed XORG packages to conda-forge deps

* Don't use xvfb for testing on osx

* Fix case on selector for skipping windows

* windse recipe test 1

* Adding recipe for pyFirmata

* Removed recipe (coq-jupyter) after converting into feedstock. [ci skip]

* Use underscores in package name and pypi download path

* adding meta.yml for first go at recipe

Signed-off-by: Vanessa Sochat <[email protected]>

* try without dash in name

Signed-off-by: Vanessa Sochat <[email protected]>

* doh, needs to be yaml

Signed-off-by: Vanessa Sochat <[email protected]>

* Fix full license name

* Rename directory to switch_model, not switch-model

* missing yaml

Signed-off-by: Vanessa Sochat <[email protected]>

* add to other list

Signed-off-by: Vanessa Sochat <[email protected]>

* oh, it's pyyaml

Signed-off-by: Vanessa Sochat <[email protected]>

* Don't build for Python 3

* Add license file

Eventually, we should get setuptools to include this in the .tar.gz file 
(see pypa/setuptools#357), and then we won't 
need to include it here.

* first successful build using .circleci/run_docker_build.sh

* removed empty lines at the end of windse/meta.yaml

* whoops, removed one line too many

* third time's a charm?

* missing spython

Signed-off-by: Vanessa Sochat <[email protected]>

* skipping windows

* Removed recipe (pyfirmata) after converting into feedstock. [ci skip]

* Removed recipe (gpi-framework) after converting into feedstock. [ci skip]

* Update recipes/windse/meta.yaml

Co-Authored-By: Uwe L. Korn <[email protected]>

* Add recipe for seekpath

* Use tests and license from .tar file (starting with 2.0.3.1)

* Remove `noarch` tag to avoid building for Python 3

* Removed recipe (seekpath) after converting into feedstock. [ci skip]

* Enforce python < 3 without using #skip

* correct Python version identifier

* Update sha256 for .tar.gz file

* New meta.yaml for pyeviews

* Add more commands to diagnose import error in tests.

* Another diagnostic command.

* Keep a copy of switch_model for run_tests.py

* Add switch entry point

* Update for Switch 2.0.4 (Python 3 compatible)

* Removed recipe (windse) after converting into feedstock. [ci skip]

* Add recipes for http3 and requests-async

* Fix licenses

* Removed recipe (ifcopenshell) after converting into feedstock. [ci skip]

* Removed recipes (http3, requests-async) after converting into feedstocks. [ci skip]

* Removed recipe (switch_model) after converting into feedstock. [ci skip]

* testing updated version

Signed-off-by: Vanessa Sochat <[email protected]>

* Removed recipe (xrviz) after converting into feedstock. [ci skip]

* don't support windows

Signed-off-by: Vanessa Sochat <[email protected]>

* remove redundancy

Signed-off-by: Vanessa Sochat <[email protected]>

* add sat-stac

* remove pyaml and spython

Signed-off-by: Vanessa Sochat <[email protected]>

* Removed recipe (sat-stac) after converting into feedstock. [ci skip]

* add sat-search

* Update recipes/xontrib-readable-traceback/meta.yaml

Co-Authored-By: Chris Burr <[email protected]>

* Update recipes/xontrib-readable-traceback/meta.yaml

Co-Authored-By: Chris Burr <[email protected]>

* python-backtrace

* Removed recipe (sat-search) after converting into feedstock. [ci skip]

* Update recipes/singularity-compose/meta.yaml

Co-Authored-By: Chris Burr <[email protected]>

* Removed recipe (xontrib-readable-traceback) after converting into feedstock. [ci skip]

* Removed recipe (singularity-compose) after converting into feedstock. [ci skip]

* Render with rpm2cpio removed

This change has already landed in conda-build. So here we are just
regenerating the conda-build recipe with this change.

* Comment out tests to work around a conda-build bug

* added perl-xml-parser req

* New meta.yaml for pyeviews

* Delete meta.yaml

* libxkbfile

* Removed recipe (libxkbfile) after converting into feedstock. [ci skip]

* added libxslt dep

* pkg-config

* Updated recipe maintainer

* Updated run requirements

* Add myself as a maintainer

* move deps to build env

* Removed recipe (xkeyboard-config) after converting into feedstock. [ci skip]

* Initial recipe for sphinx_gmt

Missing the SHA hash and doesn't list GMT as a dependency.

* Add hash of the pypi tarball

* Remove noarch

* Add recipe for httpcore

This is a little confusing, because the package has changed name to http3,
but older versions were released as httpcore.

httpcore is a dependency needed to build requests-async:

conda-forge/requests-async-feedstock#2 (comment)

* Removed recipe (httpcore) after converting into feedstock. [ci skip]

* Add bespon and codebraid

* adding deid recipe

Signed-off-by: Vanessa Sochat <[email protected]>

* typo

Signed-off-by: Vanessa Sochat <[email protected]>

* Add codebraid python version restriction

* Removed recipes (codebraid, deid) after converting into feedstocks. [ci skip]

* Removed recipe (bespon) after converting into feedstock. [ci skip]

* Add LICENSE

* add recipe for pyreportjasper

* Added license file name on meta.yaml

* Require `openssl` version `1.1.1a` [skip ci]

It appears we are not able to install some dependencies without `openssl`
version `1.1.1a` installed (`1.1.1b` is already included). This tries
forcing `openssl` to version `1.1.1a` to fix the issue.

* Force `openssl` to `1.1.1a` first [ci skip]

* Drop `openssl` install hacks [ci skip]

These don't seem to work as expected either despite getting the
seemingly required version of `openssl` installed. So just go ahead and
drop them.

* Add noarch back

* Add recipe for pyjson5.

* Add entry point metadata.

* Removed recipes (libnl-cos6-x86_64, pyeviews, sphinx_gmt, vdom) after converting into feedstocks. [ci skip]

* MSAL package

* Rerender with latest conda-smithy

* PsychoPy 3.x, add entrypoint, remove lots of test imports

* Update recipe
hoechenberger authored and kastman committed Jun 25, 2019
1 parent 9aa9aa2 commit 83cc3c3
Showing 42 changed files with 1,041 additions and 348 deletions.
2 changes: 1 addition & 1 deletion .appveyor.yml
@@ -39,7 +39,7 @@ install:
- cmd: appveyor-retry conda.exe update --yes --quiet conda


- cmd: appveyor-retry conda.exe install --yes --quiet conda-forge-pinning conda-forge-ci-setup=1.* networkx
- cmd: appveyor-retry conda.exe install --yes --quiet "conda!=4.6.1" conda-forge-pinning conda-forge-ci-setup=2.* networkx conda-build>=3.16

- cmd: appveyor-retry run_conda_forge_build_setup

20 changes: 20 additions & 0 deletions .azure-pipelines/azure-pipelines-linux.yml
@@ -0,0 +1,20 @@
jobs:
- job: linux_64
  pool:
    vmImage: ubuntu-16.04
  strategy:
    maxParallel: 8
    matrix:
      linux:
        CONFIG: azure-linux-64-comp7
        CF_MAX_PY_VER: 37
        AZURE: True
  timeoutInMinutes: 240
  steps:
  - script: |
      sudo pip install --upgrade pip
      sudo pip install setuptools shyaml
    displayName: Install dependencies
  - script: .circleci/run_docker_build.sh
    displayName: Run docker build
64 changes: 64 additions & 0 deletions .azure-pipelines/azure-pipelines-osx.yml
@@ -0,0 +1,64 @@
jobs:
- job: osx_64
  pool:
    vmImage: macOS-10.13
  strategy:
    maxParallel: 8
    matrix:
      osx:
        CONFIG: azure-osx-64-comp7
        CF_MAX_PY_VER: 37
  timeoutInMinutes: 240
  steps:
  # TODO: Fast finish on azure pipelines?
  - script: |
      echo "Fast Finish"
  - script: |
      echo "Removing homebrew from Azure to avoid conflicts."
      curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/uninstall > ~/uninstall_homebrew
      chmod +x ~/uninstall_homebrew
      ~/uninstall_homebrew -fq
      rm ~/uninstall_homebrew
    displayName: Remove homebrew
  - bash: |
      echo "##vso[task.prependpath]$CONDA/bin"
      sudo chown -R $USER $CONDA
    displayName: Add conda to PATH
  - script: |
      set -x -e
      source activate base
      conda install -n base -c conda-forge --quiet --yes conda-forge-ci-setup=2 conda-build shyaml networkx conda-forge-pinning
    displayName: 'Add conda-forge-ci-setup=2'
  - script: |
      set -x -e
      source activate base
      echo "Configuring conda."
      setup_conda_rc ./ ./recipes ./.ci_support/${CONFIG}.yaml
      source run_conda_forge_build_setup
      conda update --yes --quiet --all
    env: {
      OSX_FORCE_SDK_DOWNLOAD: "1"
    }
    displayName: Configure conda and conda-build
  - script: |
      # Find the recipes from master in this PR and remove them.
      source activate base
      echo ""
      echo "Finding recipes merged in master and removing them from the build."
      pushd ./recipes > /dev/null
      git fetch --force origin master:master
      git ls-tree --name-only master -- . | xargs -I {} sh -c "rm -rf {} && echo Removing recipe: {}"
      popd > /dev/null
      echo ""
      # We just want to build all of the recipes.
      python .ci_support/build_all.py ./recipes
86 changes: 86 additions & 0 deletions .azure-pipelines/azure-pipelines-win.yml
@@ -0,0 +1,86 @@
jobs:
- job: win_64
  pool:
    vmImage: vs2017-win2016
  strategy:
    maxParallel: 4
    matrix:
      win:
        CONFIG: azure-win-64
        CF_MAX_PY_VER: 37
  timeoutInMinutes: 240
  steps:
  # TODO: Fast finish on azure pipelines?
  - script: |
      echo "Fast Finish"
  - script: |
      choco install vcpython27 -fdv -y --debug
    displayName: Install vcpython27.msi (if needed)
  - powershell: |
      Set-PSDebug -Trace 1
      $batchcontent = @"
      ECHO ON
      SET vcpython=C:\Program Files (x86)\Common Files\Microsoft\Visual C++ for Python\9.0
      DIR "%vcpython%"
      CALL "%vcpython%\vcvarsall.bat" %*
      "@
      $batchDir = "C:\Program Files (x86)\Common Files\Microsoft\Visual C++ for Python\9.0\VC"
      $batchPath = "$batchDir" + "\vcvarsall.bat"
      New-Item -Path $batchPath -ItemType "file" -Force
      Set-Content -Value $batchcontent -Path $batchPath
      Get-ChildItem -Path $batchDir
      Get-ChildItem -Path ($batchDir + '\..')
    displayName: Patch vs2008 (if needed)
  - task: CondaEnvironment@1
    inputs:
      packageSpecs: 'python=3.6 conda-build conda conda-forge::conda-forge-ci-setup=2 networkx conda-forge-pinning' # Optional
      installOptions: "-c conda-forge"
      updateConda: false
    displayName: Install conda-build and activate environment

  - script: set PYTHONUNBUFFERED=1

  # Add our channels.
  - script: conda.exe config --set show_channel_urls true
    displayName: configure conda channels
  - script: conda.exe config --remove channels defaults
    displayName: configure conda channels
  - script: conda.exe config --add channels defaults
    displayName: configure conda channels

  - script: conda.exe config --add channels conda-forge
    displayName: configure conda channels


  # Configure the VM.
  - script: call run_conda_forge_build_setup
    displayName: conda-forge build setup

  # Find the recipes from master in this PR and remove them.
  - script: |
      git fetch --force origin master:master
      cd recipes
      for /f "tokens=*" %%a in ('git ls-tree --name-only master -- .') do rmdir /s /q %%a && echo Removing recipe: %%a
      cd ..
  # Special cased version setting some more things!
  - script: |
      git fetch --force origin master:master
      python .ci_support\build_all.py recipes --arch 64
    displayName: Build recipe (maybe vs2008)
    env: {
      VS90COMNTOOLS: "C:\\Program Files (x86)\\Common Files\\Microsoft\\Visual C++ for Python\\9.0\\VC\\bin",
    }
10 changes: 10 additions & 0 deletions .ci_support/azure-linux-64-comp7.yaml
@@ -0,0 +1,10 @@
c_compiler:
- gcc
cxx_compiler:
- gxx
fortran_compiler:
- gfortran
channel_sources:
- conda-forge,defaults
docker_image:
- condaforge/linux-anvil-comp7
8 changes: 8 additions & 0 deletions .ci_support/azure-osx-64-comp7.yaml
@@ -0,0 +1,8 @@
c_compiler:
- clang
cxx_compiler:
- clangxx
fortran_compiler:
- gfortran
channel_sources:
- conda-forge,defaults
4 changes: 4 additions & 0 deletions .ci_support/azure-win-64.yaml
@@ -0,0 +1,4 @@
channel_sources:
- conda-forge,defaults
target_platform:
- win-64
72 changes: 60 additions & 12 deletions .ci_support/build_all.py
Expand Up @@ -7,6 +7,11 @@
from collections import OrderedDict
import sys

try:
from ruamel_yaml import safe_load, safe_dump
except ImportError:
from yaml import safe_load, safe_dump


def get_host_platform():
from sys import platform
@@ -20,21 +25,51 @@ def get_host_platform():

def build_all(recipes_dir, arch):
folders = os.listdir(recipes_dir)
old_comp_folders = []
new_comp_folders = []
if not folders:
print("Found no recipes to build")
return
channel_urls = ['local', 'conda-forge', 'defaults']

# ensure that noarch path exists and is indexed for newer conda (4.4+)
noarch_path = os.path.join(sys.exec_prefix, 'conda-bld', 'noarch')
try:
os.makedirs(noarch_path)
except:
pass
conda_build.api.update_index(noarch_path)
index = conda_build.conda_interface.get_index(channel_urls=channel_urls)
conda_resolve = conda_build.conda_interface.Resolve(index)

if get_host_platform() == "win":
new_comp_folders.extend(folders)
else:
for folder in folders:
built = False
cbc = os.path.join(recipes_dir, folder, "conda_build_config.yaml")
if os.path.exists(cbc):
with open(cbc, "r") as f:
text = ''.join(f.readlines())
if 'channel_sources' in text:
specific_config = safe_load(text)
if "channel_targets" not in specific_config:
raise RuntimeError("channel_targets not found in {}".format(folder))
if "channel_sources" in specific_config:
for row in specific_config["channel_sources"]:
channels = [c.strip() for c in row.split(",")]
if channels != ['conda-forge', 'defaults'] and \
channels != ['conda-forge/label/cf201901', 'defaults']:
print("Not a standard configuration of channel_sources. Building {} individually.".format(folder))
conda_build.api.build([os.path.join(recipes_dir, folder)], config=get_config(arch, channels))
built = True
break
if not built:
old_comp_folders.append(folder)
continue
new_comp_folders.append(folder)

if old_comp_folders:
print("Building {} with conda-forge/label/cf201901".format(','.join(old_comp_folders)))
channel_urls = ['local', 'conda-forge/label/cf201901', 'defaults']
build_folders(recipes_dir, old_comp_folders, arch, channel_urls)
if new_comp_folders:
print("Building {} with conda-forge/label/main".format(','.join(new_comp_folders)))
channel_urls = ['local', 'conda-forge', 'defaults']
build_folders(recipes_dir, new_comp_folders, arch, channel_urls)



def get_config(arch, channel_urls):
exclusive_config_file = os.path.join(conda_build.conda_interface.root_dir,
'conda_build_config.yaml')
platform = get_host_platform()
@@ -48,6 +83,18 @@ def build_all(recipes_dir, arch):
config = conda_build.api.Config(
variant_config_files=variant_config_files, arch=arch,
exclusive_config_file=exclusive_config_file, channel_urls=channel_urls)
return config

def build_folders(recipes_dir, folders, arch, channel_urls):

index_path = os.path.join(sys.exec_prefix, 'conda-bld')
os.makedirs(index_path, exist_ok=True)
conda_build.api.update_index(index_path)
index = conda_build.conda_interface.get_index(channel_urls=channel_urls)
conda_resolve = conda_build.conda_interface.Resolve(index)

config = get_config(arch, channel_urls)
platform = get_host_platform()

worker = {'platform': platform, 'arch': arch,
'label': '{}-{}'.format(platform, arch)}
Expand All @@ -70,7 +117,8 @@ def build_all(recipes_dir, arch):
for node in order:
d[G.node[node]['meta'].meta_path] = 1

conda_build.api.build(list(d.keys()), config=config)
for recipe in d.keys():
conda_build.api.build([recipe], config=get_config(arch, channel_urls))


if __name__ == "__main__":
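The build_all.py hunks above split recipe folders into old-compiler and new-compiler groups by inspecting each recipe's conda_build_config.yaml for a non-standard channel_sources entry. As a standalone illustration (the helper name and boolean return below are our own framing, not part of the script):

# Illustrative reduction of the channel_sources check added to build_all.py;
# the function name and return convention are ours, not the script's API.
import os
from yaml import safe_load

def has_nonstandard_channel_sources(recipe_dir):
    cbc = os.path.join(recipe_dir, "conda_build_config.yaml")
    if not os.path.exists(cbc):
        return False
    with open(cbc) as f:
        config = safe_load(f.read()) or {}
    for row in config.get("channel_sources", []):
        channels = [c.strip() for c in row.split(",")]
        if channels not in (["conda-forge", "defaults"],
                            ["conda-forge/label/cf201901", "defaults"]):
            return True  # such a recipe is built individually with its own channels
    return False
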
31 changes: 21 additions & 10 deletions .ci_support/compute_build_graph.py
@@ -74,9 +74,9 @@ def _git_changed_files(git_rev, stop_rev=None, git_root=''):
git_root = os.getcwd()
if stop_rev:
git_rev = "{0}..{1}".format(git_rev, stop_rev)
output = subprocess.check_output(['git', 'diff-tree', '--no-commit-id',
'--name-only', '-r', git_rev],
cwd=git_root)
print("Changed files from:", git_rev, stop_rev, git_root)
output = subprocess.check_output(['git', '-C', git_root, 'diff-tree',
'--no-commit-id', '--name-only', '-r', git_rev])
files = output.decode().splitlines()
return files

@@ -217,7 +217,8 @@ def add_recipe_to_graph(recipe_dir, graph, run, worker, conda_resolve,
recipes_dir=None, config=None, finalize=False):
try:
rendered = _get_or_render_metadata(recipe_dir, worker, config=config, finalize=finalize)
except (IOError, SystemExit):
except (IOError, SystemExit) as e:
log.exception('Exception raised!')
log.warn('invalid recipe dir: %s - skipping', recipe_dir)
return None

@@ -230,7 +231,7 @@ def add_recipe_to_graph(recipe_dir, graph, run, worker, conda_resolve,

if name not in graph.nodes():
graph.add_node(name, meta=metadata, worker=worker)
add_dependency_nodes_and_edges(name, graph, run, worker, conda_resolve,
add_dependency_nodes_and_edges(name, graph, run, worker, conda_resolve, config=config,
recipes_dir=recipes_dir, finalize=finalize)

# # add the test equivalent at the same time. This is so that expanding can find it.
@@ -285,9 +286,17 @@ def add_intradependencies(graph):
# what the build and host platforms are on the build machine.
# However, all we know right now is what machine we're actually
# on (the one calculating the graph).

test_requires = m.meta.get('test', {}).get('requires', [])

log.info("node: {}".format(node))
log.info(" build: {}".format(m.ms_depends('build')))
log.info(" host: {}".format(m.ms_depends('host')))
log.info(" run: {}".format(m.ms_depends('run')))
log.info(" test: {}".format(test_requires))

deps = set(m.ms_depends('build') + m.ms_depends('host') + m.ms_depends('run') +
[conda_interface.MatchSpec(dep) for dep in
m.meta.get('test', {}).get('requires', [])])
[conda_interface.MatchSpec(dep) for dep in test_requires or []])

for dep in deps:
name_matches = (n for n in graph.nodes() if graph.node[n]['meta'].name() == dep.name)
@@ -353,7 +362,9 @@ def collapse_subpackage_nodes(graph):
if subpackages:
remap_edges = [edge for edge in graph.edges() if edge[1] in subpackages]
for edge in remap_edges:
graph.add_edge(edge[0], master_key)
# make sure not to add references to yourself
if edge[0] != master_key:
graph.add_edge(edge[0], master_key)
graph.remove_edge(*edge)

# remove nodes that have been folded into master nodes
@@ -436,7 +447,7 @@ def _buildable(name, version, recipes_dir, worker, config, finalize):


def add_dependency_nodes_and_edges(node, graph, run, worker, conda_resolve, recipes_dir=None,
finalize=False):
finalize=False, config=None):
'''add build nodes for any upstream deps that are not yet installable
changes graph in place.
@@ -461,7 +472,7 @@ def add_dependency_nodes_and_edges(node, graph, run, worker, conda_resolve, recipes_dir=None,
# " available) can't produce desired version ({})."
# .format(dep, version))
dep_name = add_recipe_to_graph(recipe_dir, graph, 'build', worker,
conda_resolve, recipes_dir, finalize=finalize)
conda_resolve, recipes_dir, config=config, finalize=finalize)
if not dep_name:
raise ValueError("Tried to build recipe {0} as dependency, which is skipped "
"in meta.yaml".format(recipe_dir))
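The _git_changed_files change above replaces the cwd=git_root argument with git's own -C flag. A simplified, standalone sketch of the resulting call (function name and defaults trimmed for illustration):

# Simplified sketch of the updated changed-files lookup using `git -C`.
import subprocess

def git_changed_files(git_rev, git_root="."):
    output = subprocess.check_output(
        ["git", "-C", git_root, "diff-tree", "--no-commit-id", "--name-only", "-r", git_rev]
    )
    return output.decode().splitlines()

# Example: git_changed_files("master..HEAD")
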
2 changes: 0 additions & 2 deletions .ci_support/osx64.yaml

This file was deleted.

2 changes: 0 additions & 2 deletions .ci_support/win32.yaml

This file was deleted.
