Commit 6b5f761

sfc-gh-anavalos and Snowflake Authors authored
Project import generated by Copybara. (#171)
GitOrigin-RevId: f7b6c41ed6c8795b00e7b2ba235ebf0802b805a7
Co-authored-by: Snowflake Authors <[email protected]>
1 parent 097f9be commit 6b5f761

69 files changed: +3084 additions, -835 deletions


CHANGELOG.md

Lines changed: 22 additions & 3 deletions

@@ -1,11 +1,31 @@
 # Release History
 
-## 1.10.0
+## 1.11.0
 
 ### Bug Fixes
 
+* ML Job: Fix `Error: Unable to retrieve head IP address` if not all instances start within the timeout.
+* ML Job: Fix `TypeError: SnowflakeCursor.execute() got an unexpected keyword argument '_force_qmark_paramstyle'`
+  when running inside Stored Procedures.
+
+### Behavior Changes
+
+### New Features
+
+* `ModelVersion.create_service()`: Made `image_repo` argument optional. By
+  default it will use a default image repo, which is
+  being rolled out in server version 9.22+.
+* Experiment Tracking (PrPr): Automatically log the model, metrics, and parameters while training Keras models with
+  `snowflake.ml.experiment.callback.keras.SnowflakeKerasCallback`.
+
+## 1.10.0
+
 ### Behavior Changes
 
+* Experiment Tracking (PrPr): The import paths for the auto-logging callbacks have changed to
+  `snowflake.ml.experiment.callback.xgboost.SnowflakeXgboostCallback` and
+  `snowflake.ml.experiment.callback.lightgbm.SnowflakeLightgbmCallback`.
+
 ### New Features
 
 * Registry: add progress bars for `ModelVersion.create_service` and `ModelVersion.log_model`.

@@ -26,13 +46,13 @@
 
 ```python
 from snowflake.ml.experiment import ExperimentTracking
+from snowflake.ml.experiment.callback import SnowflakeXgboostCallback, SnowflakeLightgbmCallback
 
 exp = ExperimentTracking(session=sp_session, database_name="ML", schema_name="PUBLIC")
 
 exp.set_experiment("MY_EXPERIMENT")
 
 # XGBoost
-from snowflake.ml.experiment.callback.xgboost import SnowflakeXgboostCallback
 callback = SnowflakeXgboostCallback(
     exp, log_model=True, log_metrics=True, log_params=True, model_name="model_name", model_signature=sig
 )

@@ -41,7 +61,6 @@ with exp.start_run():
     model.fit(X, y, eval_set=[(X_test, y_test)])
 
 # LightGBM
-from snowflake.ml.experiment.callback.lightgbm import SnowflakeLightgbmCallback
 callback = SnowflakeLightgbmCallback(
     exp, log_model=True, log_metrics=True, log_params=True, model_name="model_name", model_signature=sig
 )

bazel/environments/conda-env-all.yml

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ dependencies:
   - anyio==4.2.0
   - boto3==1.34.69
   - cachetools==5.3.3
-  - catboost==1.2.0
+  - catboost==1.2.8
   - cloudpickle==2.2.1
   - coverage==7.2.2
   - cryptography==41.0.3

bazel/environments/conda-env-ml.yml

Lines changed: 1 addition & 1 deletion

@@ -12,7 +12,7 @@ dependencies:
   - anyio==4.2.0
   - boto3==1.34.69
   - cachetools==5.3.3
-  - catboost==1.2.0
+  - catboost==1.2.8
   - cloudpickle==2.2.1
   - coverage==7.2.2
   - cryptography==41.0.3

bazel/environments/requirements_ml.txt

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@ anyio==4.2.0
 boto3==1.34.69
 build==0.10.0
 cachetools==5.3.3
-catboost==1.2.0
+catboost==1.2.8
 cloudpickle==2.2.1
 coverage==7.2.2
 cryptography==41.0.3

ci/RunBazelAction.sh

Lines changed: 4 additions & 2 deletions

@@ -1,7 +1,7 @@
 #!/bin/bash
 # DESCRIPTION: Utility Shell script to run bazel action for snowml repository
 #
-# RunBazelAction.sh <test|coverage> [-b <bazel_path>] [-m merge_gate|continuous_run|quarantined|local_unittest|local_all] [-t <target>] [-c <path_to_coverage_report>] [--tags <tags>]
+# RunBazelAction.sh <test|coverage> [-b <bazel_path>] [-m merge_gate|continuous_run|quarantined|local_unittest|local_all] [-t <target>] [-c <path_to_coverage_report>] [--tags <tags>] [--with-spcs-image]
 #
 # Args:
 #   action: bazel action, choose from test and coverage

@@ -18,6 +18,7 @@
 #   -c: specify the path to the coverage report dat file.
 #   -e: specify the environment, used to determine.
 #   --tags: specify bazel test tag filters (e.g., "feature:jobs,feature:data")
+#   --with-spcs-image: use spcs image for testing.
 #
 
 set -o pipefail

@@ -40,6 +41,7 @@ help() {
     echo ""
     echo "Options:"
     echo "  --tags <tags>        Specify bazel tag filters (comma-separated)"
+    echo "  --with-spcs-image    Use spcs image for testing."
     echo ""
     echo "Examples:"
     echo "  ${PROG} test --tags 'feature:jobs'"

@@ -109,7 +111,7 @@ fi
 action_env=()
 
 if [[ "${WITH_SPCS_IMAGE}" = true ]]; then
-    export SKIP_GRYPE=true
+    export RUN_GRYPE=false
     source model_container_services_deployment/ci/build_and_push_images.sh
     action_env=("--action_env=BUILDER_IMAGE_PATH=${BUILDER_IMAGE_PATH}" "--action_env=BASE_CPU_IMAGE_PATH=${BASE_CPU_IMAGE_PATH}" "--action_env=BASE_GPU_IMAGE_PATH=${BASE_GPU_IMAGE_PATH}" "--action_env=IMAGE_BUILD_SIDECAR_CPU_PATH=${IMAGE_BUILD_SIDECAR_CPU_PATH}" "--action_env=IMAGE_BUILD_SIDECAR_GPU_PATH=${IMAGE_BUILD_SIDECAR_GPU_PATH}" "--action_env=PROXY_IMAGE_PATH=${PROXY_IMAGE_PATH}" "--action_env=VLLM_IMAGE_PATH=${VLLM_IMAGE_PATH}")
 fi
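
Note the semantics flip in the last hunk: the opt-out `SKIP_GRYPE=true` becomes an opt-in `RUN_GRYPE=false` that the sourced `build_and_push_images.sh` is expected to read. A minimal sketch of how a sourced script can consume such a flag with a conservative default when it is unset (the `maybe_run_grype` function and its messages are illustrative, not part of the repository):

```shell
# Sketch (assumption): a consumer of the opt-in RUN_GRYPE flag.
# ${RUN_GRYPE:-false} treats an unset variable the same as "false",
# so callers that never export the flag get no scan by default.
maybe_run_grype() {
    if [[ "${RUN_GRYPE:-false}" = true ]]; then
        echo "scanning image with grype"
    else
        echo "skipping grype scan"
    fi
}

RUN_GRYPE=true
maybe_run_grype    # prints "scanning image with grype"
unset RUN_GRYPE
maybe_run_grype    # prints "skipping grype scan"
```

Defaulting with `${RUN_GRYPE:-false}` is what makes the opt-in direction safe: a caller that sets nothing gets the no-scan behavior, whereas the old `SKIP_GRYPE` convention required remembering to set a variable to turn scanning *off*.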

ci/build_and_run_tests.sh

Lines changed: 45 additions & 14 deletions

@@ -1,7 +1,7 @@
 #!/bin/bash
 
 # Usage
-# build_and_run_tests.sh <workspace> [-b <bazel path>] [--env pip|conda] [--mode merge_gate|continuous_run] [--with-snowpark] [--with-spcs-image] [--report <report_path>]
+# build_and_run_tests.sh <workspace> [-b <bazel path>] [--env pip|conda] [--mode merge_gate|continuous_run] [--with-snowpark] [--with-spcs-image] [--run-grype] [--report <report_path>]
 #
 # Args
 #   workspace: path to the workspace, SnowML code should be in snowml directory.

@@ -15,6 +15,7 @@
 #     quarantined: run all quarantined tests.
 #   with-snowpark: Build and test with snowpark in snowpark-python directory in the workspace.
 #   with-spcs-image: Build and test with spcs-image in spcs-image directory in the workspace.
+#   run-grype: Run grype security scanning on SPCS images. Only valid with --with-spcs-image.
 #   snowflake-env: The environment of the snowflake, use to determine the test quarantine list
 #   report: Path to xml test report
 #

@@ -30,7 +31,7 @@ PROG=$0
 
 help() {
     local exit_code=$1
-    echo "Usage: ${PROG} <workspace> [-b <bazel path>] [--env pip|conda] [--mode merge_gate|continuous_run|quarantined] [--with-snowpark] [--with-spcs-image] [--snowflake-env <sf_env>] [--report <report_path>]"
+    echo "Usage: ${PROG} <workspace> [-b <bazel path>] [--env pip|conda] [--mode merge_gate|continuous_run|quarantined] [--with-snowpark] [--with-spcs-image] [--run-grype] [--snowflake-env <sf_env>] [--report <report_path>]"
     exit "${exit_code}"
 }
 

@@ -39,6 +40,7 @@ BAZEL="bazel"
 ENV="pip"
 WITH_SNOWPARK=false
 WITH_SPCS_IMAGE=false
+RUN_GRYPE=false
 MODE="continuous_run"
 PYTHON_VERSION=3.9
 PYTHON_ENABLE_SCRIPT="bin/activate"

@@ -91,6 +93,9 @@ while (($#)); do
     --with-spcs-image)
         WITH_SPCS_IMAGE=true
         ;;
+    --run-grype)
+        RUN_GRYPE=true
+        ;;
     -h | --help)
         help 0
         ;;

@@ -101,6 +106,12 @@ while (($#)); do
     shift
 done
 
+# Validate flag combinations
+if [ "${RUN_GRYPE}" = true ] && [ "${WITH_SPCS_IMAGE}" = false ]; then
+    echo "Error: --run-grype flag requires --with-spcs-image to be set"
+    help 1
+fi
+
 echo "Running build_and_run_tests with PYTHON_VERSION ${PYTHON_VERSION}"
 
 EXT=""

@@ -180,6 +191,25 @@ trap 'rm -rf "${TEMP_BIN}"' EXIT
 # Install micromamba
 _MICROMAMBA_BIN="micromamba${EXT}"
 if [ "${ENV}" = "conda" ]; then
+    CONDA="/mnt/jenkins/home/jenkins/miniforge3/condabin/conda"
+
+    # Check if miniforge is already installed
+    if [ -x "${CONDA}" ]; then
+        echo "Miniforge exists at ${CONDA}."
+    else
+        echo "Downloading miniforge ..."
+        curl -L -O "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"
+
+        echo "Installing miniforge ..."
+        /bin/bash "Miniforge3-$(uname)-$(uname -m).sh" -b -u
+    fi
+
+    echo "Using ${CONDA} ..."
+
+    echo "Installing conda-build ..."
+    ${CONDA} install conda-build --yes
+
+    echo "Installing micromamba ..."
     if ! command -v "${_MICROMAMBA_BIN}" &>/dev/null; then
         curl -Lsv "https://github.com/mamba-org/micromamba-releases/releases/latest/download/micromamba-${MICROMAMBA_PLATFORM}-${MICROMAMBA_ARCH}" -o "${TEMP_BIN}/micromamba${EXT}" && chmod +x "${TEMP_BIN}/micromamba${EXT}"
         _MICROMAMBA_BIN="${TEMP_BIN}/micromamba${EXT}"

@@ -264,30 +294,31 @@ if [ "${ENV}" = "pip" ]; then
     cp "$("${BAZEL}" "${BAZEL_ADDITIONAL_STARTUP_FLAGS[@]+"${BAZEL_ADDITIONAL_STARTUP_FLAGS[@]}"}" info bazel-bin)/dist/snowflake_ml_python-${VERSION}-py3-none-any.whl" "${WORKSPACE}"
     popd
 else
-    # Clean conda cache
-    conda clean --all --force-pkgs-dirs -y
+    echo "Cleaning conda cache ..."
+    ${CONDA} clean --all --force-pkgs-dirs -y
 
-    # Clean conda build workspace
+    echo "Cleaning conda build workspace ..."
     rm -rf "${WORKSPACE}/conda-bld"
 
-    # Build Snowpark
+    echo "Building snowpark-python conda package ..."
     if [ "${WITH_SNOWPARK}" = true ]; then
         pushd ${SNOWPARK_DIR}
-        conda build recipe/ --python=${PYTHON_VERSION} --numpy=1.16 --croot "${WORKSPACE}/conda-bld"
+        ${CONDA} build recipe/ --python=${PYTHON_VERSION} --numpy=1.16 --croot "${WORKSPACE}/conda-bld"
         popd
     fi
 
-    # Build SnowML
     pushd ${SNOWML_DIR}
-    # Build conda package
-    conda build -c conda-forge --override-channels --prefix-length 50 --python=${PYTHON_VERSION} --croot "${WORKSPACE}/conda-bld" ci/conda_recipe
-    conda build purge
+
+    echo "Building snowflake-ml-python conda package ..."
+    ${CONDA} build -c conda-forge --override-channels --prefix-length 50 --python=${PYTHON_VERSION} --croot "${WORKSPACE}/conda-bld" ci/conda_recipe
+    ${CONDA} build purge
     popd
 fi
 
 if [[ "${WITH_SPCS_IMAGE}" = true ]]; then
     pushd ${SNOWML_DIR}
-    # Build SPCS Image
+    echo "Building SPCS Image ..."
+    export RUN_GRYPE
     source model_container_services_deployment/ci/build_and_push_images.sh
     popd
 fi

@@ -361,7 +392,7 @@ for i in "${!groups[@]}"; do
         COMMON_PYTEST_FLAG+=(-m "not conda_incompatible")
     fi
     # Create local conda channel
-    conda index "${WORKSPACE}/conda-bld"
+    ${CONDA} index "${WORKSPACE}/conda-bld"
 
     # Clean conda cache
     "${_MICROMAMBA_BIN}" clean --all --force-pkgs-dirs -y

@@ -384,7 +415,7 @@ for i in "${!groups[@]}"; do
 
     # Run integration tests
     set +e
-    TEST_SRCDIR="${TEMP_TEST_DIR}" conda run -p ./testenv --no-capture-output python -m pytest "${COMMON_PYTEST_FLAG[@]}" tests/integ/
+    TEST_SRCDIR="${TEMP_TEST_DIR}" ${CONDA} run -p ./testenv --no-capture-output python -m pytest "${COMMON_PYTEST_FLAG[@]}" tests/integ/
     group_exit_codes[$i]=$?
     set -e
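
The new `--run-grype` flag is only meaningful together with `--with-spcs-image`, so the script parses all flags first and validates the combination afterwards. The same parse-then-validate pattern, as a self-contained sketch (the flag names and error message match the script; the `parse_flags` wrapper function is ours for illustration):

```shell
# Sketch of the parse-then-validate pattern from build_and_run_tests.sh:
# accumulate booleans in one pass, then reject invalid combinations.
# `parse_flags` is an illustrative wrapper, not a function in the script.
parse_flags() {
    local WITH_SPCS_IMAGE=false RUN_GRYPE=false
    while (($#)); do
        case $1 in
        --with-spcs-image) WITH_SPCS_IMAGE=true ;;
        --run-grype) RUN_GRYPE=true ;;
        esac
        shift
    done
    # --run-grype depends on --with-spcs-image; reject the lone flag.
    if [ "${RUN_GRYPE}" = true ] && [ "${WITH_SPCS_IMAGE}" = false ]; then
        echo "Error: --run-grype flag requires --with-spcs-image to be set"
        return 1
    fi
    echo "ok: spcs=${WITH_SPCS_IMAGE} grype=${RUN_GRYPE}"
}

parse_flags --run-grype || true             # prints the error line
parse_flags --with-spcs-image --run-grype   # prints "ok: spcs=true grype=true"
```

Validating after the loop, rather than inside each `case` arm, keeps the check correct regardless of the order in which the two flags appear on the command line.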

ci/conda_recipe/meta.yaml

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -17,7 +17,7 @@ build:
1717
noarch: python
1818
package:
1919
name: snowflake-ml-python
20-
version: 1.10.0
20+
version: 1.11.0
2121
requirements:
2222
build:
2323
- python

ci/targets/quarantine/prod3.txt

Lines changed: 1 addition & 1 deletion

@@ -1,7 +1,7 @@
 //tests/integ/snowflake/ml/extra_tests:xgboost_external_memory_training_test
 //tests/integ/snowflake/ml/extra_tests:pipeline_with_ohe_and_xgbr_test
-//tests/integ/snowflake/ml/lineage:lineage_integ_test
 //tests/integ/snowflake/ml/modeling/manifold:spectral_embedding_test
 //tests/integ/snowflake/ml/modeling/linear_model:logistic_regression_test
 //tests/integ/snowflake/ml/registry/services:registry_huggingface_pipeline_model_deployment_test
 //tests/integ/snowflake/ml/registry/services:registry_sentence_transformers_model_deployment_test
+//tests/integ/snowflake/ml/jobs:jobs_integ_test

requirements.yml

Lines changed: 1 addition & 1 deletion

@@ -84,7 +84,7 @@
 - name: boto3
   dev_version: 1.34.69
 - name: catboost
-  dev_version: 1.2.0
+  dev_version: 1.2.8
   version_requirements: '>=1.2.0, <2'
   requirements_extra_tags:
     - catboost

snowflake/ml/experiment/callback/BUILD.bazel

Lines changed: 24 additions & 0 deletions

@@ -11,6 +11,29 @@ py_library(
     ],
 )
 
+py_library(
+    name = "keras",
+    srcs = ["keras.py"],
+    deps = [
+        "//snowflake/ml/experiment:experiment_tracking",
+        "//snowflake/ml/experiment:utils",
+        "//snowflake/ml/model:model_signature",
+    ],
+)
+
+py_test(
+    name = "keras_test",
+    srcs = ["test/keras_test.py"],
+    optional_dependencies = [
+        "keras",
+    ],
+    tags = ["feature:observability"],
+    deps = [
+        ":keras",
+        ":test_base",
+    ],
+)
+
 py_library(
     name = "lightgbm",
     srcs = ["lightgbm.py"],

@@ -56,6 +79,7 @@ py_test(
 py_library(
     name = "callback",
     deps = [
+        ":keras",
         ":lightgbm",
         ":xgboost",
     ],

0 commit comments