- Input Data and preprocessing: https://github.com/uboone/OpenSamples. Use the hdf5 files, 'Inclusive, WithWire'
Use ubopendata conda environment for this part
The wire_table contains (Wire, Timetick, ADC) values. I used plane2, the collection plane
The input data is preprocessed to reduce the model size.
My naming scheme is AXB, where A and B are the division factors for Timetick and Wire respectively.
The Timetick dimension is downsampled by a factor of 10 before division
ex: 10X4. The original input size for a single event is (Wire, Timetick) = (3456, 6400) --> (3456, 640) after downsampling --> (864, 64) after division
hdf5 files are preprocessed into npy files through 01_data2npy_full.py. Samples are inside inputData
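The real preprocessing lives in 01_data2npy_full.py; the sketch below is only one possible reading of the scheme above, assuming the sparse wire_table entries are scattered into a dense (Wire, Timetick) image, the Timetick axis is mean-downsampled by 10, and the "division" is a block-mean pooling by the (Wire, Timetick) factors (splitting the image into A x B patches of the same size would be another reading). Function and file names are illustrative.

```python
import numpy as np

def preprocess_event(wire, timetick, adc,
                     n_wires=3456, n_ticks=6400,
                     tick_downsample=10, wire_div=4, tick_div=10):
    """Sketch of the 10X4 preprocessing for one plane-2 event.

    wire, timetick, adc : 1D integer/float arrays from the wire_table (plane 2 only).
    Returns an array of shape (n_wires // wire_div, n_ticks // tick_downsample // tick_div),
    i.e. (864, 64) for the default 10X4 settings.
    """
    # Scatter the sparse (Wire, Timetick, ADC) entries into a dense image,
    # summing ADC values that land on the same (wire, tick) cell.
    img = np.zeros((n_wires, n_ticks), dtype=np.float32)
    np.add.at(img, (wire, timetick), adc)

    # Downsample the Timetick axis by a factor of 10 (mean over blocks of 10 ticks;
    # the real script may sum or pick every 10th tick instead).
    img = img.reshape(n_wires, n_ticks // tick_downsample, tick_downsample).mean(axis=2)

    # "Division" step: shrink (Wire, Timetick) by the (B, A) = (4, 10) factors,
    # here as block-mean pooling over 4x10 blocks.
    h, w = img.shape
    img = img.reshape(h // wire_div, wire_div, w // tick_div, tick_div).mean(axis=(1, 3))
    return img  # shape (864, 64) for 10X4
```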
- Training Teacher: Use 02_train_teacher.py
Use the ubqkeras conda environment for steps 2 to 4
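The actual training is done by 02_train_teacher.py; below is a minimal sketch, assuming the teacher is an autoencoder from models.py trained to reconstruct the preprocessed images with MSE loss. The constructor name, file names, and normalisation are assumptions.

```python
import numpy as np
import tensorflow as tf
from models import teacher  # constructor name assumed; the real definitions are in models.py

# Assumed layout: preprocessed (864, 64) images from 01_data2npy_full.py stacked in one npy file.
x_train = np.load("inputData/train_10X4.npy").astype(np.float32)
x_train = x_train[..., np.newaxis]        # add a channel axis for the conv layers
x_train /= max(x_train.max(), 1.0)        # crude ADC normalisation (assumption)

# The teacher is an autoencoder: it is trained to reconstruct its own input.
model = teacher()                         # its reshape/dense sizes must match (864, 64, 1)
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, x_train, epochs=50, batch_size=32, validation_split=0.1)

model.save("savedModel/teacher")          # reloaded by 03_train_student.py and 04_getLoss.py
```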
- Training Student: Use 03_train_student.py
Requires the Teacher from step 2
Trained models are saved in savedModel
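03_train_student.py holds the real logic; the sketch below assumes a distillation setup in the spirit of CICADA, where the student regresses the teacher's per-event reconstruction MSE (its anomaly score). Constructor and file names are assumptions.

```python
import numpy as np
import tensorflow as tf
from models import student  # constructor name assumed; defined in models.py

x_train = np.load("inputData/train_10X4.npy").astype(np.float32)[..., np.newaxis]
x_train /= max(x_train.max(), 1.0)        # same normalisation as the teacher (assumption)

teacher = tf.keras.models.load_model("savedModel/teacher")

# The target for the student is the teacher's per-event reconstruction MSE,
# i.e. the teacher's anomaly score.
recon = teacher.predict(x_train, batch_size=32)
scores = np.mean((recon - x_train) ** 2, axis=(1, 2, 3))

# The (qkeras-quantised) student regresses that score directly from the image.
model = student()
model.compile(optimizer="adam", loss="mse")
model.fit(x_train, scores, epochs=50, batch_size=32, validation_split=0.1)

model.save("savedModel/student")
```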
- Evaluating Loss (or 'Anomaly Score'): Use 04_getLoss.py
Evaluates the loss (or Anomaly Score in our language) and saves it into an npy file
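04_getLoss.py is the real evaluation script; a minimal sketch, assuming the anomaly score is the per-event MSE between the autoencoder input and its reconstruction, saved as one npy array of scores. File names are placeholders.

```python
import numpy as np
import tensorflow as tf

x_eval = np.load("inputData/eval_10X4.npy").astype(np.float32)[..., np.newaxis]
x_eval /= max(x_eval.max(), 1.0)          # keep the normalisation consistent with training

model = tf.keras.models.load_model("savedModel/teacher")

# Anomaly score = per-event reconstruction MSE between input and output.
recon = model.predict(x_eval, batch_size=32)
loss = np.mean((recon - x_eval) ** 2, axis=(1, 2, 3))

np.save("loss_teacher_10X4.npy", loss)    # one score per event
```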
- Dependencies: Use conda_envs
If one has to install the environments from scratch, the .txt files list the dependencies
file.py and microboone_utils.py are obtained from https://github.com/uboone/OpenSamples
QDenseBatchnorm.py is from this branch of qkeras: google/qkeras#74
- Models: Taken from https://github.com/Princeton-AD/cicada/blob/main/models.py
Defined in models.py
The teacher_reshape, teacher_dense, and teacher_reshape2 values need to be changed according to the preprocessed input image size
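The actual architectures are the ones in models.py (taken from the CICADA repo above); the sketch below is only meant to show how those three values are tied to the preprocessed image size. The layer stack, filter counts, and the exact role of teacher_reshape are assumptions for illustration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

# Example values for the 10X4 preprocessing, i.e. an (864, 64, 1) input image.
# These are the numbers that have to be updated whenever the input size changes.
input_shape = (864, 64, 1)
teacher_reshape = (216, 16, 8)                 # encoder output after two 2x2 poolings (assumed)
teacher_dense = int(np.prod(teacher_reshape))  # size of the dense layer feeding the decoder
teacher_reshape2 = teacher_reshape             # shape the decoder unflattens back to

def build_teacher(input_shape=input_shape):
    """Hedged sketch of a convolutional autoencoder in the spirit of the CICADA teacher."""
    inp = layers.Input(shape=input_shape)
    x = layers.Conv2D(8, 3, padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D(2)(x)              # (432, 32, 8)
    x = layers.Conv2D(8, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D(2)(x)              # (216, 16, 8) == teacher_reshape
    x = layers.Flatten()(x)
    x = layers.Dense(teacher_dense, activation="relu")(x)
    x = layers.Reshape(teacher_reshape2)(x)
    x = layers.Conv2DTranspose(8, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(1, 3, strides=2, padding="same")(x)   # back to (864, 64, 1)
    return tf.keras.Model(inp, x, name="teacher")
```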