Merge upstream changes (#98)
* [batch] Worker cleanup (hail-is#10155)

* [batch] Worker cleanup

* more changes

* wip

* delint

* additions?

* fix

* [query] Add `source_file_field` to `import_table` (hail-is#10164)

* [query] Add `source_file_field` to `import_table`

CHANGELOG: Add `source_file_field` parameter to `hl.import_table` to allow lines to be associated with their original source file.
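
A minimal usage sketch of the new parameter (bucket path hypothetical):

```python
import hail as hl

# Each imported row can now be traced back to the shard it was read from.
ht = hl.import_table('gs://my-bucket/shards/*.tsv', source_file_field='source_file')
ht.select(ht.source_file).show()
```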

* ugh

* [ci] add authorize sha and action items table to user page (hail-is#10142)

* [ci] add authorize sha and action items table to user page

* [ci] track review requested in addition to assigned for PR reviews

* [ci] add CI dropdown with link to user page (hail-is#10163)

* [batch] add more logs and do not wait for asyncgens (hail-is#10136)

* [batch] add more logs and do not wait for asyncgens

I think there is some unresolved issue with asyncgen shutdown that is keeping
workers alive. This is not an issue in worker because worker calls sys.exit
which forcibly stops execution. cc: @daniel-goldstein @jigold.
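
A sketch of the hang being worked around, in plain asyncio (not the actual batch code): `loop.shutdown_asyncgens()` awaits every abandoned async generator's finalizer, so a single hung finalizer keeps the process alive, while `sys.exit` never reaches that wait.

```python
import asyncio

async def ticker():
    try:
        while True:
            yield 1
            await asyncio.sleep(1)
    finally:
        await asyncio.sleep(3600)  # a hung finalizer

async def main():
    async for _ in ticker():
        break  # abandon the generator without closing it

loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(main())
finally:
    # asyncio's recommended cleanup: it aclose()es the abandoned generator
    # and blocks on its finally block (here, for an hour).
    loop.run_until_complete(loop.shutdown_asyncgens())
    loop.close()
```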

* fix lint

* [query-service] maybe fix event loop not initialized (hail-is#10153)

* [query-service] maybe fix event loop not initialized

The event loop is supposed to be initialized in the main thread. Sometimes
our tests get placed in a non-main thread (always a thread named Dummy-1).
Hopefully the session-scoped fixture is run in the main thread.
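
A hedged sketch of such a fixture (names assumed, not the actual test code):

```python
import asyncio

import pytest

@pytest.fixture(scope='session', autouse=True)
def session_event_loop():
    # Runs once, before any test: install a loop so code that later calls
    # asyncio.get_event_loop() finds one, even from a worker thread.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    yield loop
    loop.close()
```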

* fix

* [prometheus] add prometheus to track SLIs (hail-is#10165)

* [prometheus] add prometheus to track SLIs

* add wraps
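
The instrumentation looks roughly like this, assuming the standard `prometheus_client` library; the `@wraps` line is the "add wraps" fix, preserving handler names for routing and logs.

```python
import time
from functools import wraps

from prometheus_client import Summary

REQUEST_TIME = Summary('http_request_seconds', 'Time spent handling a request')

def monitor(handler):
    @wraps(handler)  # keep the wrapped handler's name and docstring
    async def wrapped(request):
        start = time.time()
        try:
            return await handler(request)
        finally:
            REQUEST_TIME.observe(time.time() - start)
    return wrapped
```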

* [query] apply nest-asyncio as early as possible (hail-is#10158)

* [query] apply nest-asyncio as early as possible
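
The fix amounts to patching the loop before anything else touches it; `nest_asyncio.apply()` is the library's entry point:

```python
# At the very top of the package, before any event loop is created or used.
import nest_asyncio

nest_asyncio.apply()
```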

* fix

* [grafana] set pod fsGroup to grafana user (hail-is#10162)

* fix linting errors (hail-is#10171)

* [query] Remove verbose print (hail-is#10167)

Looks like this got added in some dndarray work

* [ci] update assignees and reviewers on PR github update (hail-is#10168)

* [query-service] fix receive logic (hail-is#10159)

* [query-service] fix receive logic

Only one coro waits on receive now. We still error if a message is sent before
we make our first response.
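
A sketch of the single-reader pattern described, in aiohttp terms (names assumed):

```python
import asyncio

from aiohttp import WSMsgType

async def read_loop(ws, queue: asyncio.Queue):
    # The only coroutine that calls receive(); everyone else consumes the
    # queue, so two waiters can never race on the socket.
    while True:
        msg = await ws.receive()
        if msg.type in (WSMsgType.CLOSE, WSMsgType.CLOSED, WSMsgType.ERROR):
            break
        await queue.put(msg)
```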

* fix

* fix

* CHANGELOG: Fixed incorrect error message when an incorrect type is specified with hl.loop (hail-is#10174)

* [linting] add curlylint check for any service that renders jinja2 (hail-is#10172)

* [linting] add curlylint check for any service that renders jinja2 templates

* [linting] spaces not tabs

* [website] fix website (hail-is#10173)

* [website] fix website

I build old versions of the docs and use them in new websites. This does not
work for versions of the docs before I introduced the new system. In particular
versions 0.2.63 and before generate old-style docs.

* tutorials are templated

* [ci] change mention for deploy failure (hail-is#10178)

* [gateway] move ukbb routing into gateway (hail-is#10179)

* [query] Fix filter intervals (keep=False) memory leak (hail-is#10182)

* [query-service] remove service backend tests (hail-is#10180)

They are too flaky currently due to the version issue.

* [website] pass response body as kwarg (hail-is#10176)

* Release 0.2.64 (hail-is#10183)

* Bump version number

* Updated changelog

* [nginx] ensure nginx configs don't overwrite each other in build.yaml (hail-is#10181)

* [query-service] teach query service to read MTs and Ts created by Spark (hail-is#10184)

* [query-service] teach query service to read MTs and Ts created by Spark

Hail-on-Spark uses HadoopFS which emulates directories by creating size-zero files with
the name `gs://bucket/dirname/`. Note: the object name literally ends in a slash. Such files
should not be included in `listStatus` (they should always be empty anyway). Unfortunately,
my fix in hail-is#9914 was wrong: `GoogleStorageFileStatus` removes the trailing slash,
so the returned name could never match `path`, which always ends in a `/`.
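
A Python stand-in for the Scala logic (names hypothetical): the marker object keeps its trailing slash in the bucket listing, but the file status strips it, so the comparison must strip both sides.

```python
def list_status(path, names):
    # `path` always ends in '/'; GoogleStorageFileStatus-style names do not.
    # Drop the size-zero directory marker for the directory being listed.
    marker = path.rstrip('/')
    return [n for n in names if n.rstrip('/') != marker]
```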

* fix

* [website] don't jinja-render any of the batch docs (hail-is#10190)

* [googlestoragefs] ignore the directory check entirely (hail-is#10185)

* [googlestoragefs] ignore the directory check entirely

If a file exists with the *same name as the directory we are listing*,
then it must be a directory marker. It does not matter if that file is
a directory or not.

* Update GoogleStorageFS.scala

* [ci] fix focus on slash and search job page for PRs (hail-is#10194)

* [query] Improve file compatibility error (hail-is#10191)

* Call init_service from init based on HAIL_QUERY_BACKEND value. (hail-is#10189)
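
Roughly, with the backend value's exact spelling assumed:

```python
import os

def init(**kwargs):
    # Sketch of the dispatch this adds; real signatures differ.
    if os.environ.get('HAIL_QUERY_BACKEND') == 'service':
        return init_service(**kwargs)
    return init_spark(**kwargs)

def init_service(**kwargs):
    ...  # service-backend setup (stub)

def init_spark(**kwargs):
    ...  # Spark-backend setup (stub; hypothetical name)
```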

* [query] NDArray Sum (hail-is#10187)

* Attempt implementing the sum rule in Emit

* Connected the python code, but not working yet

* NDArrayExpression.sum is working now

* Add default arg when no axis is provided

* More comprehensive test

* Unused imports

* Use sum appropriately in linear_regression_rows_nd

* Deleted extra blank line

* Don't use typeToTypeInfo, make NumericPrimitives the source of these decisions

* Better assertions, with tests

* Got the summation index correct

* Add documentation
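
Usage of the new `NDArrayExpression.sum`, per the tests above (values worked by hand):

```python
import hail as hl

nd = hl.nd.array([[1, 2], [3, 4]])
hl.eval(nd.sum())        # 10: the default sums over every axis
hl.eval(nd.sum(axis=0))  # [4, 6]: column sums
```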

* [website] fix resource path for non-html files in the docs (hail-is#10196)

* [query] Remove tcode from primitive orderings (hail-is#10193)

* [query] BlockMatrix map (hail-is#10195)

* Add map, but protect users of the spark backend from writing arbitrary maps

* If densify would have been a no-op, that should work

* Densify and Sparsify are no-ops for now

* Rename map to map_dense and map_sparse. Give better implementations for add, multiply, divide, subtract of a scalar

* Make the maps underscore methods

* [query] Remove all uses of .tcode[Boolean] (hail-is#10198)

* [ci] make test hello speak https (hail-is#10192)

* [tls] make hello use tls

* change pylint ignore message

* [query] blanczos_pca dont do extra loading work (hail-is#10201)

* Use the checkpointed table from mt_to_table_of_ndarray to avoid recomputing mt

* Keep extra row fields from being included

* Add query graceful shutdown for rolling updates (hail-is#10106)

* Merge pull request #35 from populationgenomics/add-query-graceful-shutdown

Add query graceful shutdown

* Remove unused argument from query:on_shutdown
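
In aiohttp terms, the graceful-shutdown hook looks roughly like this (the drained component is hypothetical):

```python
from aiohttp import web

async def on_shutdown(app: web.Application):
    # Let in-flight queries drain before the pod is replaced
    # during a rolling update.
    await app['query_runner'].drain()

app = web.Application()
app.on_shutdown.append(on_shutdown)  # registered exactly once
```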

* [auth] add more options for obtaining session id for dev credentials (hail-is#10203)

* [auth] add more options for obtaining session id for dev credentials

* [auth] extract userinfo query for use in both userinfo and verify_dev_credentials

* remove unused import

* [query] Default to Spark 3 (hail-is#10054)

* Change Hail to use Spark 3 and Scala 2.12 by default; change build_hail_spark3 to instead test Spark 2 for backwards support

* Update Makefile

* Update dataproc image version

* Scale down the dataproc version, since the latest dataproc uses a Spark release candidate

* Update pyspark version in requirements.txt

* Bump scala/spark patch versions

* We want to use the newer py4j jar when using spark 3

* Upgrade json4s

* I now want Spark 3.1.1, since it's been released

* Upgrade to 3.1.1 in the Makefile; fix a deprecated IOUtils method

* Update pyspark as well

* Don't update json4s

* Try upgrading version

* Fixed issue for constructing bufferspecs

* Should at least be using newest one

* Remove abstracts from type hints

* Revert "Remove abstracts from type hints"

This reverts commit 1e0d194.

* Things don't go well if I don't use the same json4s version as Spark

* Fixed a typeHintFieldName

* See if this fixes my BlockMatrixSparsity issue

* json4s can't handle a curried apply method

* This works so long as the jar file is included in the libs directory

* Makefile changes to support pulling elasticsearch

* Use dataproc image for Spark 3.1.1

* Update patch version of dataproc image, no longer uses Spark RC

* Fixed up Makefile, now correctly depends on copying the jar

* Now we just check that the specified version is 7, as that's all we support

* Delete build_hail_spark2, we can't support spark2

* Version checks for Scala and Spark

* Updated installation docs

* Spark versions warning

* Update some old pysparks

* [batch] Add more info to UI pages (hail-is#10070)

* [batch] Add more info to UI pages

* fixes

* addr comment

* addr comments

* Bump jinja2 from 2.10.1 to 2.11.3 in /docker (hail-is#10209)

Bumps [jinja2](https://github.com/pallets/jinja) from 2.10.1 to 2.11.3.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/master/CHANGES.rst)
- [Commits](pallets/jinja@2.10.1...2.11.3)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* [docker][hail] update to latest pytest (hail-is#10177)

* [docker][hail] update to latest pytest

Issues like this https://ci.hail.is/batches/221291/jobs/112 do not appear locally for me;
I suspect this is due to my using a much newer pytest.

* fix many tests incorrectly using pytest

* another one

* remove unnecessary pip installs in service test dockerfiles

* fix

* [gateway] Cut out router and router-resolver from gateway internal routing (hail-is#10207)

* [gateway] cut out router-resolver from internal auth flow

* [gateway] cut out router from internal

* [datasets] add pan-ukb datasets (hail-is#10186)

* add available pan-ukb datasets

* add rst files for schemas

* reference associated variant indices HT in the block matrix descriptions

* [query] Add json warn context to `parse_json` (hail-is#10160)

We don't test the logs, but I did test this manually; it works as
expected.

* [query] fix tmp_dir default in init(), which doesn't work for the service backend (hail-is#10199)

* Fix tmp_dir default, which doesn't work for the service backend.

* Fix type for tmp_dir.

* [gitignore] ignore website and doc files (hail-is#10214)

* Remove duplicate on_shutdown in query service

Co-authored-by: jigold <jigold@users.noreply.github.com>
Co-authored-by: Tim Poterba <tpoterba@broadinstitute.org>
Co-authored-by: Daniel Goldstein <danielgold95@gmail.com>
Co-authored-by: Dan King <daniel.zidan.king@gmail.com>
Co-authored-by: John Compitello <johnc@broadinstitute.org>
Co-authored-by: Christopher Vittal <cvittal@broadinstitute.org>
Co-authored-by: Michael Franklin <michael@illusional.net>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Patrick Cummings <42842025+pwc2@users.noreply.github.com>
Co-authored-by: Carolin Diaz <63973811+CDiaz96@users.noreply.github.com>
11 people authored Mar 23, 2021
1 parent 5298568 commit 317f9a3
Showing 73 changed files with 975 additions and 211 deletions.
5 changes: 5 additions & 0 deletions .gitignore
@@ -27,3 +27,8 @@ GTAGS
*.dylib
*/hail.jar
infra/.terraform.lock.hcl
hail/python/hail/docs/experimental/hail.experimental.DB.rst
hail/python/hailtop/batch/docs/api/
web_common/web_common/static/css/
website/docs.tar.gz
website/website/static/css/
47 changes: 29 additions & 18 deletions auth/auth/auth.py
@@ -8,7 +8,6 @@
import google.auth.transport.requests
import google.oauth2.id_token
import google_auth_oauthlib.flow
from hailtop.auth import async_get_userinfo
from hailtop.config import get_deploy_config
from hailtop.tls import internal_server_ssl_context
from hailtop.hail_logging import AccessLogger
@@ -526,18 +525,7 @@ async def rest_logout(request, userdata):
return web.Response(status=200)


@routes.get('/api/v1alpha/userinfo')
async def userinfo(request):
if 'Authorization' not in request.headers:
log.info('Authorization not in request.headers')
raise web.HTTPUnauthorized()

auth_header = request.headers['Authorization']
session_id = maybe_parse_bearer_header(auth_header)
if not session_id:
log.info('Bearer not in Authorization header')
raise web.HTTPUnauthorized()

async def get_userinfo(request, session_id):
# b64 encoding of 32-byte session ID is 44 bytes
if len(session_id) != 44:
log.info('Session id != 44 bytes')
@@ -554,18 +542,41 @@ async def userinfo(request):
if len(users) != 1:
log.info(f'Unknown session id: {session_id}')
raise web.HTTPUnauthorized()
user = users[0]
return users[0]


@routes.get('/api/v1alpha/userinfo')
async def userinfo(request):
if 'Authorization' not in request.headers:
log.info('Authorization not in request.headers')
raise web.HTTPUnauthorized()

auth_header = request.headers['Authorization']
session_id = maybe_parse_bearer_header(auth_header)
if not session_id:
log.info('Bearer not in Authorization header')
raise web.HTTPUnauthorized()

return web.json_response(await get_userinfo(request, session_id))


return web.json_response(user)
async def get_session_id(request):
if 'X-Hail-Internal-Authorization' in request.headers:
return maybe_parse_bearer_header(request.headers['X-Hail-Internal-Authorization'])

if 'Authorization' in request.headers:
return maybe_parse_bearer_header(request.headers['Authorization'])

session = await aiohttp_session.get_session(request)
return session.get('session_id')


@routes.get('/api/v1alpha/verify_dev_credentials')
async def verify_dev_credentials(request):
session = await aiohttp_session.get_session(request)
session_id = session.get('session_id')
session_id = await get_session_id(request)
if not session_id:
raise web.HTTPUnauthorized()
userdata = await async_get_userinfo(session_id=session_id)
userdata = await get_userinfo(request, session_id)
is_developer = userdata is not None and userdata['is_developer'] == 1
if not is_developer:
raise web.HTTPUnauthorized()
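
The 44-byte check in `get_userinfo` above is base64 arithmetic: 32 bytes pad to ceil(32/3) * 4 = 44 characters.

```python
import base64
import os

session_id = base64.b64encode(os.urandom(32))
assert len(session_id) == 44  # ceil(32/3) * 4
```
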
2 changes: 1 addition & 1 deletion batch/Dockerfile.worker
@@ -10,7 +10,7 @@ RUN hail-apt-get-install \
COPY docker/hail-ubuntu/pip.conf /root/.config/pip/pip.conf
COPY docker/hail-ubuntu/hail-pip-install /bin/hail-pip-install
COPY docker/requirements.txt .
RUN hail-pip-install -r requirements.txt pyspark==2.4.0
RUN hail-pip-install -r requirements.txt pyspark==3.1.1

ENV SPARK_HOME /usr/local/lib/python3.7/site-packages/pyspark
ENV PATH "$PATH:$SPARK_HOME/sbin:$SPARK_HOME/bin"
3 changes: 3 additions & 0 deletions batch/batch/batch.py
@@ -41,6 +41,7 @@ def _time_msecs_str(t):

d = {
'id': record['id'],
'user': record['user'],
'billing_project': record['billing_project'],
'token': record['token'],
'state': state,
@@ -85,6 +86,8 @@ def job_record_to_dict(record, name):
'batch_id': record['batch_id'],
'job_id': record['job_id'],
'name': name,
'user': record['user'],
'billing_project': record['billing_project'],
'state': record['state'],
'exit_code': exit_code,
'duration': duration
27 changes: 16 additions & 11 deletions batch/batch/front_end/front_end.py
@@ -222,7 +222,8 @@ async def _query_batch_jobs(request, batch_id):
where_args.extend(args)

sql = f'''
SELECT jobs.*, batches.format_version, job_attributes.value AS name, SUM(`usage` * rate) AS cost
SELECT jobs.*, batches.user, batches.billing_project, batches.format_version,
job_attributes.value AS name, SUM(`usage` * rate) AS cost
FROM jobs
INNER JOIN batches ON jobs.batch_id = batches.id
LEFT JOIN job_attributes
@@ -1150,7 +1151,7 @@ async def _get_job(app, batch_id, job_id):
db: Database = app['db']

record = await db.select_and_fetchone('''
SELECT jobs.*, ip_address, format_version, SUM(`usage` * rate) AS cost
SELECT jobs.*, user, billing_project, ip_address, format_version, SUM(`usage` * rate) AS cost
FROM jobs
INNER JOIN batches
ON jobs.batch_id = batches.id
@@ -1252,28 +1253,31 @@ async def ui_get_job(request, userdata, batch_id):
app = request.app
job_id = int(request.match_info['job_id'])

job_status, attempts, job_log = await asyncio.gather(_get_job(app, batch_id, job_id),
_get_attempts(app, batch_id, job_id),
_get_job_log(app, batch_id, job_id))
job, attempts, job_log = await asyncio.gather(_get_job(app, batch_id, job_id),
_get_attempts(app, batch_id, job_id),
_get_job_log(app, batch_id, job_id))

job_status_status = job_status['status']
job['duration'] = humanize_timedelta_msecs(job['duration'])
job['cost'] = cost_str(job['cost'])

job_status = job['status']
container_status_spec = dictfix.NoneOr({
'name': str,
'timing': {'pulling': dictfix.NoneOr({'duration': dictfix.NoneOr(Number)}),
'running': dictfix.NoneOr({'duration': dictfix.NoneOr(Number)})},
'container_status': {'out_of_memory': False},
'state': str})
job_status_status_spec = {
job_status_spec = {
'container_statuses': {'input': container_status_spec,
'main': container_status_spec,
'output': container_status_spec}}
job_status_status = dictfix.dictfix(job_status_status, job_status_status_spec)
container_statuses = job_status_status['container_statuses']
job_status = dictfix.dictfix(job_status, job_status_spec)
container_statuses = job_status['container_statuses']
step_statuses = [container_statuses['input'],
container_statuses['main'],
container_statuses['output']]

job_specification = job_status['spec']
job_specification = job['spec']
if 'process' in job_specification:
process_specification = job_specification['process']
process_type = process_specification['type']
@@ -1289,11 +1293,12 @@ async def ui_get_job(request, userdata, batch_id):
page_context = {
'batch_id': batch_id,
'job_id': job_id,
'job': job,
'job_log': job_log,
'attempts': attempts,
'step_statuses': step_statuses,
'job_specification': job_specification,
'job_status_str': json.dumps(job_status, indent=2)
'job_status_str': json.dumps(job, indent=2)
}
return await render_template('batch', request, userdata, 'job.html', page_context)

23 changes: 22 additions & 1 deletion batch/batch/front_end/templates/batch.html
@@ -4,7 +4,28 @@
<script src="{{ base_path }}/common_static/focus_on_keyup.js"></script>
{% endblock %}
{% block content %}

<h1>Batch {{ batch['id'] }}</h1>

<h2>Properties</h2>
<ul>
<li>User: {{ batch['user'] }}</li>
<li>Billing Project: {{ batch['billing_project'] }}</li>
<li>Time Created: {% if 'time_created' in batch and batch['time_created'] is not none %}{{ batch['time_created'] }}{% endif %}</li>
<li>Time Closed: {% if 'time_closed' in batch and batch['time_closed'] is not none %}{{ batch['time_closed'] }}{% endif %}</li>
<li>Time Completed: {% if 'time_completed' in batch and batch['time_completed'] is not none %}{{ batch['time_completed'] }}{% endif %}</li>
<li>Total Jobs: {{ batch['n_jobs'] }}</li>
<ul>
<li>Pending Jobs: {{ batch['n_jobs'] - batch['n_completed'] }}</li>
<li>Succeeded Jobs: {{ batch['n_succeeded'] }}</li>
<li>Failed Jobs: {{ batch['n_failed'] }}</li>
<li>Cancelled Jobs: {{ batch['n_cancelled'] }}</li>
</ul>
<li>Duration: {% if 'duration' in batch and batch['duration'] is not none %}{{ batch['duration'] }}{% endif %}</li>
<li>Cost: {% if 'cost' in batch and batch['cost'] is not none %}{{ batch['cost'] }}{% endif %}</li>
</ul>

<h2>Attributes</h2>
{% if 'attributes' in batch %}
{% for name, value in batch['attributes'].items() %}
<p>{{ name }}: {{ value }}</p>
@@ -64,7 +85,7 @@ <h2>Jobs</h2>
<tbody>
{% for job in batch['jobs'] %}
<tr>
<td class="numeric-cell">
<td class="numeric-cell" onClick="document.location.href='{{ base_path }}/batches/{{ job['batch_id'] }}/jobs/{{ job['job_id'] }}';">
<a href="{{ base_path }}/batches/{{ job['batch_id'] }}/jobs/{{ job['job_id'] }}">{{ job['job_id'] }}</a>
</td>
<td>
8 changes: 7 additions & 1 deletion batch/batch/front_end/templates/batches.html
@@ -52,6 +52,8 @@ <h1>Batches</h1>
<thead>
<tr>
<th>ID</th>
<th>User</th>
<th>Billing Project</th>
<th>Name</th>
<th>Submitted</th>
<th>Completed</th>
@@ -68,7 +70,11 @@ <h1>Batches</h1>
<tbody>
{% for batch in batches %}
<tr>
<td class="numeric-cell"><a href="{{ base_path }}/batches/{{ batch['id'] }}">{{ batch['id'] }}</a></td>
<td class="numeric-cell" onClick="document.location.href='{{ base_path }}/batches/{{ batch['id'] }}';">
<a href="{{ base_path }}/batches/{{ batch['id'] }}">{{ batch['id'] }}</a>
</td>
<td>{{ batch['user'] }}</td>
<td>{{ batch['billing_project'] }}</td>
<td>
{% if 'attributes' in batch and 'name' in batch['attributes'] and batch['attributes']['name'] is not none %}
{{ batch['attributes']['name'] }}
12 changes: 12 additions & 0 deletions batch/batch/front_end/templates/job.html
@@ -3,6 +3,18 @@
{% block content %}
<h1>Batch {{ batch_id }} Job {{ job_id }}</h1>

<h2>Properties</h2>
<ul>
<li><a href="{{ base_path }}/batches/{{ batch_id }}">Batch ID: {{ batch_id }}</a></li>
<li>Job ID: {{ job_id }}</li>
<li>User: {{ job['user'] }} </li>
<li>Billing Project: {{ job['billing_project'] }}</li>
<li>State: {{ job['state'] }}</li>
<li>Exit Code: {% if 'exit_code' in job and job['exit_code'] is not none %}{{ job['exit_code'] }}{% endif %}</li>
<li>Duration: {% if 'duration' in job and job['duration'] is not none %}{{ job['duration'] }}{% endif %}</li>
<li>Cost: {% if 'cost' in job and job['cost'] is not none %}{{ job['cost'] }}{% endif %}</li>
</ul>

<h2>Attempts</h2>
{% if attempts %}
<table class="data-table">
1 change: 1 addition & 0 deletions batch/test/test_dag.py
@@ -156,6 +156,7 @@ def test():
callback_body.pop('duration')
assert (callback_body == {
'id': b.id,
'user': 'test',
'billing_project': 'test',
'token': token,
'state': 'success',
3 changes: 0 additions & 3 deletions benchmark-service/Dockerfile.test
@@ -1,6 +1,3 @@
FROM {{ service_base_image.image }}

COPY benchmark-service/test/ /test/
RUN python3 -m pip install --no-cache-dir \
pytest-instafail==0.4.1 \
pytest-asyncio==0.10.0
3 changes: 1 addition & 2 deletions benchmark-service/test/test_update_commits.py
@@ -9,14 +9,13 @@
from hailtop.httpx import client_session
import hailtop.utils as utils

pytestmark = pytest.mark.asyncio

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

sha = 'd626f793ad700c45a878d192652a0378818bbd8b'


@pytest.mark.asyncio
async def test_update_commits():
deploy_config = get_deploy_config()
headers = service_auth_headers(deploy_config, 'benchmark')
24 changes: 3 additions & 21 deletions build.yaml
@@ -684,26 +684,6 @@ steps:
to: /cluster-tests.tar.gz
dependsOn:
- hail_build_image
- kind: runImage
name: build_hail_spark3
image:
valueFrom: hail_build_image.image
resources:
memory: "7.5G"
cpu: "4"
script: |
set -ex
cd /
rm -rf repo
mkdir repo
cd repo
{{ code.checkout_script }}
cd hail
time retry ./gradlew --version
export SPARK_VERSION="3.0.1" SCALA_VERSION="2.12.12"
time retry make jars python-version-info wheel
dependsOn:
- hail_build_image
- kind: buildImage
name: batch_worker_image
dockerFile: batch/Dockerfile.worker
@@ -2830,6 +2810,8 @@ steps:
mkdir -p ./ci/test ./hail/python
cp /repo/hail/ci/test/resources/build.yaml ./
cp -R /repo/hail/ci/test/resources ./ci/test/
cp /repo/hail/tls/Dockerfile ./ci/test/resources/Dockerfile.certs
cp /repo/hail/tls/create_certs.py ./ci/test/resources/
cp /repo/hail/pylintrc ./
cp /repo/hail/setup.cfg ./
cp -R /repo/hail/docker ./
@@ -3289,7 +3271,7 @@ steps:
script: |
set -ex
gcloud auth activate-service-account --key-file=/secrets/ci-deploy-0-1--hail-is-hail.json
SPARK_VERSION=2.4.5
SPARK_VERSION=3.1.1
BRANCH=0.2
SHA="{{ code.sha }}"
GS_JAR=gs://hail-common/builds/${BRANCH}/jars/hail-${BRANCH}-${SHA}-Spark-${SPARK_VERSION}.jar
1 change: 0 additions & 1 deletion ci/Dockerfile.test
@@ -4,4 +4,3 @@ COPY hail/python/setup-hailtop.py /hailtop/setup.py
COPY hail/python/hailtop /hailtop/hailtop/
RUN hail-pip-install /hailtop && rm -rf /hailtop
COPY ci/test/ /test/
RUN hail-pip-install pytest-instafail==0.4.1 pytest-asyncio==0.10.0
30 changes: 30 additions & 0 deletions ci/test/resources/build.yaml
@@ -52,6 +52,36 @@ steps:
publishAs: service-base
dependsOn:
- base_image
- kind: buildImage
name: create_certs_image
dockerFile: ci/test/resources/Dockerfile.certs
contextPath: ci/test/resources
publishAs: test_hello_create_certs_image
dependsOn:
- service_base_image
- kind: runImage
name: create_certs
image:
valueFrom: create_certs_image.image
script: |
set -ex
python3 create_certs.py \
{{ default_ns.name }} \
config.yaml \
/ssl-config-hail-root/hail-root-key.pem \
/ssl-config-hail-root/hail-root-cert.pem
serviceAccount:
name: admin
namespace:
valueFrom: default_ns.name
secrets:
- name: ssl-config-hail-root
namespace:
valueFrom: default_ns.name
mountPath: /ssl-config-hail-root
dependsOn:
- default_ns
- create_certs_image
- kind: buildImage
name: hello_image
dockerFile: ci/test/resources/Dockerfile
4 changes: 4 additions & 0 deletions ci/test/resources/config.yaml
@@ -0,0 +1,4 @@
principals:
- name: hello
domain: hello
kind: json