"""
SLISE - Sparse Linear Subset Explanations
-----------------------------------------

The SLISE algorithm can be used both for robust regression and for explaining outcomes from black box models.
See [slise.slise.regression][] and [slise.slise.explain][] for reference.


In robust regression we fit regression models that can handle data
containing outliers. SLISE accomplishes this by fitting a model such that
the largest possible subset of the data items has an error smaller than a
given tolerance. All items with a larger error are considered potential
outliers and do not affect the resulting model.
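As a toy illustration of this objective (not the SLISE optimiser itself, which
is far more efficient; the function name, data, and brute-force grid search
below are all invented for this sketch), a 1D search over candidate slopes
shows how maximising the subset size makes the outliers irrelevant:

```python
def subset_size(beta, X, y, epsilon):
    # Count items whose residual under the model y ~ beta * x is within epsilon
    return sum(1 for xi, yi in zip(X, y) if abs(yi - beta * xi) <= epsilon)

# Data following y = 2x, plus two gross outliers at x = 4 and x = 5
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.0, 2.0, 4.0, 6.0, 50.0, -30.0]

epsilon = 0.5
candidates = [b / 10 for b in range(-50, 51)]  # coarse grid of slopes
best = max(candidates, key=lambda b: subset_size(b, X, y, epsilon))
# The best slope lands near 2: the two outliers fall outside every large
# subset, so they have no influence on the chosen model.
```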
SLISE can also be used to provide local model-agnostic explanations for
outcomes from black box models. To do this we replace the ground truth
response vector with the predictions from the complex model. Furthermore, we
force the model to fit a selected item (making the explanation local). This
gives us a local approximation of the complex model with a simpler linear
model. In contrast to other methods, SLISE creates explanations using real
data (not discretised and randomly sampled data), so we can be sure that
all inputs are valid (i.e. within the correct data manifold, and consistent
with the constraints used to generate the data, e.g., the laws of physics).


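The explanation setup can likewise be sketched with a toy example. Everything
here (the `black_box` function, the data, and the grid search) is invented for
illustration and is not the slise API: the responses are replaced by black-box
predictions, and centring on the selected item forces any linear model through
the origin to fit that item exactly.

```python
def black_box(x):
    # Hypothetical black box model (an assumption for this sketch): y = x^2
    return x * x

X = [0.6, 0.8, 0.9, 1.1, 1.2, 1.4, 3.0]  # real data items
x0 = 1.0  # the item whose outcome we want to explain

# Replace the ground truth with black-box predictions and centre on x0, so
# that any linear model through the origin passes exactly through (x0, f(x0))
Xc = [xi - x0 for xi in X]
Yc = [black_box(xi) - black_box(x0) for xi in X]

epsilon = 0.1
candidates = [b / 20 for b in range(-100, 101)]
best = max(
    candidates,
    key=lambda b: sum(1 for xi, yi in zip(Xc, Yc) if abs(yi - b * xi) <= epsilon),
)
# best is a rough local gradient estimate of the black box around x0; the
# far-away item at x = 3.0 cannot be fitted and drops out of the subset.
```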
More in-depth details about the algorithm can be found in the papers:

Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.
Sparse Robust Regression for Explaining Classifiers.
Discovery Science (DS 2019).
Lecture Notes in Computer Science, vol 11828, Springer.
https://doi.org/10.1007/978-3-030-33778-0_27

Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.
Robust regression via error tolerance.
Data Mining and Knowledge Discovery (2022).
https://doi.org/10.1007/s10618-022-00819-2

Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.
Explaining any black box model using real data.
Frontiers in Computer Science 5:1143904 (2023).
https://doi.org/10.3389/fcomp.2023.1143904
"""

from slise.slise import (  # noqa: F401
    SliseRegression,
    regression,
    SliseExplainer,
    explain,
    SliseWarning,
)
from slise.utils import limited_logit as logit  # noqa: F401
from slise.data import normalise_robust  # noqa: F401