Update readme for optimizer and rewriter tools

GitHub Actions / Test Results failed Jun 24, 2024 in 0s

77 fail, 2 537 skipped, 5 633 pass in 1h 7m 4s

     30 files      30 suites   1h 7m 4s ⏱️
  8 247 tests  5 633 ✅   2 537 💤  77 ❌
160 098 runs  55 746 ✅ 104 243 💤 109 ❌

Results for commit c2c08b6.

Annotations

Check warning on line 0 in onnxscript.tools.transformers_models.phi3_test.TestExportPhi3

4 out of 24 runs failed: test_phi3_export_cpu (onnxscript.tools.transformers_models.phi3_test.TestExportPhi3)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 5s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 7s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 7s]
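To reproduce one of these failures locally, the failing case can be run on its own with pytest. A minimal sketch, assuming the test module lives at onnxscript/tools/transformers_models/phi3_test.py (inferred from the test id above) and that onnxscript plus a torch nightly build are installed:

    # Select and run only the failing test case by name.
    import pytest

    pytest.main([
        "onnxscript/tools/transformers_models/phi3_test.py",
        "-k", "test_phi3_export_cpu",
        "-v",
    ])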
Raw output
ValueError: Undefined variable model_layers_0_resid_attn_dropout_1.
Available variables: SubScope 0:
  Function transformers_models_phi3_modeling_phi3_Phi3DecoderLayer_model_layers_0_1:
    ir.Values:
      embedding: StaticValueInfo(embedding, shape:[13, 7, 32], dtype:1, no const value.)
      view: StaticValueInfo(view, shape:[1, 7], dtype:7, no const value.)
      masked_fill_2: StaticValueInfo(masked_fill_2, shape:[13, 1, 7, 7], dtype:1, no const value.)
      model.layers.0.input_layernorm.weight: StaticValueInfo(model.layers.0.input_layernorm.weight, shape:[32], dtype:1, has const value.)
      model.layers.0.self_attn.qkv_proj.weight: StaticValueInfo(model.layers.0.self_attn.qkv_proj.weight, shape:[64, 32], dtype:1, no const value.)
      model.layers.0.self_attn.rotary_emb.inv_freq: StaticValueInfo(model.layers.0.self_attn.rotary_emb.inv_freq, shape:[4], dtype:1, has const value.)
      model.layers.0.self_attn.o_proj.weight: StaticValueInfo(model.layers.0.self_attn.o_proj.weight, shape:[32, 32], dtype:1, no const value.)
      model.layers.0.post_attention_layernorm.weight: StaticValueInfo(model.layers.0.post_attention_layernorm.weight, shape:[32], dtype:1, has const value.)
      model.layers.0.mlp.gate_up_proj.weight: StaticValueInfo(model.layers.0.mlp.gate_up_proj.weight, shape:[32, 32], dtype:1, has const value.)
      model.layers.0.mlp.down_proj.weight: StaticValueInfo(model.layers.0.mlp.down_proj.weight, shape:[32, 16], dtype:1, has const value.)
      model_layers_0_input_layernorm_1: StaticValueInfo(model_layers_0_input_layernorm_1, shape:[13, 7, 32], dtype:1, no const value.)
      model_layers_0_self_attn_1: StaticValueInfo(model_layers_0_self_attn_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_0_self_attn_1_1: StaticValueInfo(model_layers_0_self_attn_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_0_self_attn_1_2: StaticValueInfo(model_layers_0_self_attn_1_2, shape:[13, 7, 32], dtype:1, no const value.)
      add_5: StaticValueInfo(add_5, shape:[13, 7, 32], dtype:1, no const value.)
      add_7: StaticValueInfo(add_7, shape:[13, 7, 32], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_model_layers_0_self_attn_qkv_proj_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_model_layers_0_self_attn_qkv_proj_1, shape:[13, 7, 64], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_8: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_8, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_12: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_12, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_16: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_16, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_20: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_20, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_7: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_7, shape:[13, 7, 32], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_25: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_25, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_29: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_29, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_33: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_33, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_37: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_37, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_8: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_8, shape:[13, 7, 16], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_42: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_42, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_46: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_46, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_50: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_50, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_54: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_54, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_9: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_9, shape:[13, 7, 16], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_57: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_57, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_4: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_4, shape:[13, 7, 4, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_61: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_61, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_5: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_5, shape:[13, 7, 2, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_65: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_65, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_6: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_6, shape:[13, 7, 2, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_10: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_10, shape:[1, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_11: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_11, shape:[1, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_2, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_77: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_77, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_81: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_81, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_85: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_85, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_89: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_89, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_13: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_13, shape:[13, 4, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_94: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_94, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_98: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_98, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_102: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_102, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_106: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_106, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_14: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_14, shape:[13, 4, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_neg: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_neg, shape:[13, 4, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_cat_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_cat_1, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_3: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_3, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_add_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_add_2, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_4: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_4, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_116: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_116, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_120: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_120, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_124: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_124, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_128: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_128, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_15: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_15, shape:[13, 2, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_133: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_133, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_137: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_137, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_141: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_141, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_145: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_145, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_16: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_16, shape:[13, 2, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_neg_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_neg_1, shape:[13, 2, 7, 4], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_cat_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_cat_2, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_5: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_mul_5, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_154: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_154, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_158: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_158, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_162: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_162, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_166: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_166, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_17: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_17, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_171: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_171, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_175: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_175, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_179: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_179, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_183: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_183, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_18: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_18, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_unsqueeze_258_dim_0: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_unsqueeze_258_dim_0, shape:[], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_12: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_12, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_189: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_189, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_193: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_193, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_197: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_197, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_201: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_201, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_19: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_19, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_206: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_206, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_210: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_210, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_214: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_214, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_218: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_218, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_20: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_20, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_220: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_220, shape:[5], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_294_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_294_size_1, shape:[5], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone, shape:[13, 2, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_223: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_223, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_228: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_228, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_232: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_232, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_236: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_236, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_240: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_240, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_21: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_21, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_245: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_245, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_249: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_249, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_253: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_253, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_257: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_257, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_22: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_22, shape:[13, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_unsqueeze_332_dim_0: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_unsqueeze_332_dim_0, shape:[], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_13: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_unsqueeze_13, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_263: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_263, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_267: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_267, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_271: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_271, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_275: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_275, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_23: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_23, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_280: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_280, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_284: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_284, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_288: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_288, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_292: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_292, shape:[1], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_24: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_slice_24, shape:[13, 2, 1, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_294: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_294, shape:[5], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_368_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_368_size_1, shape:[5], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_1, shape:[13, 2, 2, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_297: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_297, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view_1, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose_4: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_transpose_4, shape:[13, 4, 8, 7], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_300: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_300, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_374_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_374_size_1, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_2, shape:[13, 4, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_303: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_303, shape:[3], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__unsafe_view_2, shape:[52, 7, 8], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_305: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_305, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_379_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_379_size_1, shape:[4], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_9: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_9, shape:[13, 4, 8, 7], dtype:1, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_308: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_308, shape:[3], dtype:7, no const value.)
      transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_10: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_10, shape:[52, 8, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_bmm_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_bmm_1, shape:[52, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_312: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_312, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_11: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_11, shape:[13, 4, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_314: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_314, shape:[], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_div: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_div, shape:[13, 4, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_add_4: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_add_4, shape:[13, 4, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_3: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_3, shape:[13, 4, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_318: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_318, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_392_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_392_size_1, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_10: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_10, shape:[13, 4, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_321: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_321, shape:[3], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_12: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_12, shape:[52, 7, 7], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_323: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_323, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_397_size_1: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_aten_expand_397_size_1, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_11: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_expand_11, shape:[13, 4, 7, 8], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_326: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_326, shape:[3], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_13: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_13, shape:[52, 7, 8], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_bmm_2: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_bmm_2, shape:[52, 7, 8], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_330: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_330, shape:[4], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_14: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_14, shape:[13, 4, 7, 8], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_4: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_clone_4, shape:[13, 7, 4, 8], dtype:1, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_334: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1__val_334, shape:[3], dtype:7, no const value.)
E         transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_15: StaticValueInfo(transformers_models_phi3_modeling_phi3_Phi3Attention_model_layers_0_self_attn_1_1_view_15, shape:[13, 7, 32], dtype:1, no const value.)
E       RefAttributes:

Check warning on line 0 in onnxscript.tools.transformers_models.phi_test.TestExportPhi

@github-actions / Test Results

4 out of 24 runs failed: test_phi_export_cpu (onnxscript.tools.transformers_models.phi_test.TestExportPhi)

artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 5s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 8s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 3s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 5s]
Raw output
ValueError: Undefined variable model_embed_dropout_1.
Available variables: SubScope 0:
  Function transformers_models_phi_modeling_phi_PhiModel_model_1:
    ir.Values:
      l_input_ids_: StaticValueInfo(l_input_ids_, shape:[13, 7], dtype:7, no const value.)
      l_attention_mask_: StaticValueInfo(l_attention_mask_, shape:[13, 7], dtype:1, no const value.)
      model.embed_tokens.weight: StaticValueInfo(model.embed_tokens.weight, shape:[99, 32], dtype:1, has const value.)
      model.layers.0.input_layernorm.weight: StaticValueInfo(model.layers.0.input_layernorm.weight, shape:[32], dtype:1, has const value.)
      model.layers.0.input_layernorm.bias: StaticValueInfo(model.layers.0.input_layernorm.bias, shape:[32], dtype:1, has const value.)
      model.layers.0.self_attn.q_proj.weight: StaticValueInfo(model.layers.0.self_attn.q_proj.weight, shape:[32, 32], dtype:1, has const value.)
      model.layers.0.self_attn.q_proj.bias: StaticValueInfo(model.layers.0.self_attn.q_proj.bias, shape:[32], dtype:1, has const value.)
      model.layers.0.self_attn.k_proj.weight: StaticValueInfo(model.layers.0.self_attn.k_proj.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.0.self_attn.k_proj.bias: StaticValueInfo(model.layers.0.self_attn.k_proj.bias, shape:[16], dtype:1, has const value.)
      model.layers.0.self_attn.v_proj.weight: StaticValueInfo(model.layers.0.self_attn.v_proj.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.0.self_attn.v_proj.bias: StaticValueInfo(model.layers.0.self_attn.v_proj.bias, shape:[16], dtype:1, has const value.)
      model.layers.0.self_attn.rotary_emb.cos_cached: StaticValueInfo(model.layers.0.self_attn.rotary_emb.cos_cached, shape:[512, 4], dtype:1, has const value.)
      model.layers.0.self_attn.rotary_emb.sin_cached: StaticValueInfo(model.layers.0.self_attn.rotary_emb.sin_cached, shape:[512, 4], dtype:1, has const value.)
      model.layers.0.self_attn.dense.weight: StaticValueInfo(model.layers.0.self_attn.dense.weight, shape:[32, 32], dtype:1, has const value.)
      model.layers.0.self_attn.dense.bias: StaticValueInfo(model.layers.0.self_attn.dense.bias, shape:[32], dtype:1, has const value.)
      model.layers.0.mlp.fc1.weight: StaticValueInfo(model.layers.0.mlp.fc1.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.0.mlp.fc1.bias: StaticValueInfo(model.layers.0.mlp.fc1.bias, shape:[16], dtype:1, has const value.)
      model.layers.0.mlp.fc2.weight: StaticValueInfo(model.layers.0.mlp.fc2.weight, shape:[32, 16], dtype:1, has const value.)
      model.layers.0.mlp.fc2.bias: StaticValueInfo(model.layers.0.mlp.fc2.bias, shape:[32], dtype:1, has const value.)
      model.layers.1.input_layernorm.weight: StaticValueInfo(model.layers.1.input_layernorm.weight, shape:[32], dtype:1, has const value.)
      model.layers.1.input_layernorm.bias: StaticValueInfo(model.layers.1.input_layernorm.bias, shape:[32], dtype:1, has const value.)
      model.layers.1.self_attn.q_proj.weight: StaticValueInfo(model.layers.1.self_attn.q_proj.weight, shape:[32, 32], dtype:1, has const value.)
      model.layers.1.self_attn.q_proj.bias: StaticValueInfo(model.layers.1.self_attn.q_proj.bias, shape:[32], dtype:1, has const value.)
      model.layers.1.self_attn.k_proj.weight: StaticValueInfo(model.layers.1.self_attn.k_proj.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.1.self_attn.k_proj.bias: StaticValueInfo(model.layers.1.self_attn.k_proj.bias, shape:[16], dtype:1, has const value.)
      model.layers.1.self_attn.v_proj.weight: StaticValueInfo(model.layers.1.self_attn.v_proj.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.1.self_attn.v_proj.bias: StaticValueInfo(model.layers.1.self_attn.v_proj.bias, shape:[16], dtype:1, has const value.)
      model.layers.1.self_attn.rotary_emb.cos_cached: StaticValueInfo(model.layers.1.self_attn.rotary_emb.cos_cached, shape:[512, 4], dtype:1, has const value.)
      model.layers.1.self_attn.rotary_emb.sin_cached: StaticValueInfo(model.layers.1.self_attn.rotary_emb.sin_cached, shape:[512, 4], dtype:1, has const value.)
      model.layers.1.self_attn.dense.weight: StaticValueInfo(model.layers.1.self_attn.dense.weight, shape:[32, 32], dtype:1, has const value.)
      model.layers.1.self_attn.dense.bias: StaticValueInfo(model.layers.1.self_attn.dense.bias, shape:[32], dtype:1, has const value.)
      model.layers.1.mlp.fc1.weight: StaticValueInfo(model.layers.1.mlp.fc1.weight, shape:[16, 32], dtype:1, has const value.)
      model.layers.1.mlp.fc1.bias: StaticValueInfo(model.layers.1.mlp.fc1.bias, shape:[16], dtype:1, has const value.)
      model.layers.1.mlp.fc2.weight: StaticValueInfo(model.layers.1.mlp.fc2.weight, shape:[32, 16], dtype:1, has const value.)
      model.layers.1.mlp.fc2.bias: StaticValueInfo(model.layers.1.mlp.fc2.bias, shape:[32], dtype:1, has const value.)
      model.final_layernorm.weight: StaticValueInfo(model.final_layernorm.weight, shape:[32], dtype:1, has const value.)
      model.final_layernorm.bias: StaticValueInfo(model.final_layernorm.bias, shape:[32], dtype:1, has const value.)
      unsqueeze: StaticValueInfo(unsqueeze, shape:[1, 7], dtype:7, no const value.)
      _val_39: StaticValueInfo(_val_39, shape:[1], dtype:7, no const value.)
      _val_43: StaticValueInfo(_val_43, shape:[1], dtype:7, no const value.)
      _val_47: StaticValueInfo(_val_47, shape:[1], dtype:7, no const value.)
      _val_51: StaticValueInfo(_val_51, shape:[1], dtype:7, no const value.)
      slice_3: StaticValueInfo(slice_3, shape:[13, 7], dtype:1, no const value.)
      aten_unsqueeze_84_dim_0: StaticValueInfo(aten_unsqueeze_84_dim_0, shape:[], dtype:7, no const value.)
      unsqueeze_3: StaticValueInfo(unsqueeze_3, shape:[13, 1, 7], dtype:1, no const value.)
      aten_unsqueeze_85_dim_0: StaticValueInfo(aten_unsqueeze_85_dim_0, shape:[], dtype:7, no const value.)
      unsqueeze_4: StaticValueInfo(unsqueeze_4, shape:[13, 1, 1, 7], dtype:1, no const value.)
      _val_58: StaticValueInfo(_val_58, shape:[1], dtype:7, no const value.)
      _val_62: StaticValueInfo(_val_62, shape:[1], dtype:7, no const value.)
      _val_66: StaticValueInfo(_val_66, shape:[1], dtype:7, no const value.)
      _val_70: StaticValueInfo(_val_70, shape:[1], dtype:7, no const value.)
      slice_4: StaticValueInfo(slice_4, shape:[13, 1, 1, 7], dtype:1, no const value.)
      _val_72: StaticValueInfo(_val_72, shape:[4], dtype:7, no const value.)
      aten_expand_104_size_1: StaticValueInfo(aten_expand_104_size_1, shape:[4], dtype:7, no const value.)
      expand_1: StaticValueInfo(expand_1, shape:[13, 1, 7, 7], dtype:1, no const value.)
      _val_74: StaticValueInfo(_val_74, shape:[], dtype:1, no const value.)
      rsub: StaticValueInfo(rsub, shape:[13, 1, 7, 7], dtype:1, no const value.)
      _to_copy: StaticValueInfo(_to_copy, shape:[13, 1, 7, 7], dtype:9, no const value.)
      _val_77: StaticValueInfo(_val_77, shape:[], dtype:1, no const value.)
      masked_fill_1: StaticValueInfo(masked_fill_1, shape:[13, 1, 7, 7], dtype:1, no const value.)
      _to_copy_1: StaticValueInfo(_to_copy_1, shape:[13, 1, 7, 7], dtype:9, no const value.)
      expand_2: StaticValueInfo(expand_2, shape:[13, 1, 7, 7], dtype:1, no const value.)
      _val_119: StaticValueInfo(_val_119, shape:[], dtype:1, no const value.)
      masked_fill_2: StaticValueInfo(masked_fill_2, shape:[13, 1, 7, 7], dtype:1, no const value.)
      model_layers_0_1: StaticValueInfo(model_layers_0_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_0_1_1: StaticValueInfo(model_layers_0_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_0_1_2: StaticValueInfo(model_layers_0_1_2, shape:[13, 7, 32], dtype:1, no const value.)
      model_layers_1_1: StaticValueInfo(model_layers_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_1_1_1: StaticValueInfo(model_layers_1_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
      model_layers_1_1_2: StaticValueInfo(model_layers_1_1_2, shape:[13, 7, 32], dtype:1, no const value.)
      model_final_layernorm_1: StaticValueInfo(model_final_layernorm_1, shape:[13, 7, 32], dtype:1, no const value.)
    RefAttributes:
onnxscript/tools/transformers_models/phi_test.py:29: in test_phi_export_cpu
    proto = onnxscript.tools.transformers_models.export_to_onnx(model, *input_tensors)
onnxscript/tools/transformers_models/__init__.py:28: in export_to_onnx
    model_proto = onnxscript.optimizer.optimize(
onnxscript/optimizer/__init__.py:80: in optimize
    inline_functions_with_unused_outputs(model)
onnxscript/optimizer/simple_function_folding.py:231: in inline_functions_with_unused_outputs
    inliner.visit_model(model)
onnxscript/optimizer/simple_function_folding.py:33: in visit_model
    super().visit_model(model)
onnxscript/_legacy_ir/visitor.py:792: in visit_model
    self.visit_graph(model.graph)
onnxscript/_legacy_ir/visitor.py:658: in visit_graph
    replacement = self.visit_node(node)
onnxscript/_legacy_ir/visitor.py:805: in visit_node
    replacement, _ = self.process_function_node(node)
onnxscript/optimizer/simple_function_folding.py:44: in process_function_node
    replacement, new_function = super().process_function_node(node)
onnxscript/_legacy_ir/visitor.py:892: in process_function_node
    replacement = self.visit_node(inner_node)
onnxscript/_legacy_ir/visitor.py:805: in visit_node
    replacement, _ = self.process_function_node(node)
onnxscript/optimizer/simple_function_folding.py:44: in process_function_node
    replacement, new_function = super().process_function_node(node)
onnxscript/_legacy_ir/visitor.py:846: in process_function_node
    actual_input_value_infos = [self.lookup(input) for input in node.input]
onnxscript/_legacy_ir/visitor.py:447: in lookup
    raise ValueError(
E   ValueError: Undefined variable model_embed_dropout_1.
E   Available variables: SubScope 0:
E     Function transformers_models_phi_modeling_phi_PhiModel_model_1:
E       ir.Values:
E         l_input_ids_: StaticValueInfo(l_input_ids_, shape:[13, 7], dtype:7, no const value.)
E         l_attention_mask_: StaticValueInfo(l_attention_mask_, shape:[13, 7], dtype:1, no const value.)
E         model.embed_tokens.weight: StaticValueInfo(model.embed_tokens.weight, shape:[99, 32], dtype:1, has const value.)
E         model.layers.0.input_layernorm.weight: StaticValueInfo(model.layers.0.input_layernorm.weight, shape:[32], dtype:1, has const value.)
E         model.layers.0.input_layernorm.bias: StaticValueInfo(model.layers.0.input_layernorm.bias, shape:[32], dtype:1, has const value.)
E         model.layers.0.self_attn.q_proj.weight: StaticValueInfo(model.layers.0.self_attn.q_proj.weight, shape:[32, 32], dtype:1, has const value.)
E         model.layers.0.self_attn.q_proj.bias: StaticValueInfo(model.layers.0.self_attn.q_proj.bias, shape:[32], dtype:1, has const value.)
E         model.layers.0.self_attn.k_proj.weight: StaticValueInfo(model.layers.0.self_attn.k_proj.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.0.self_attn.k_proj.bias: StaticValueInfo(model.layers.0.self_attn.k_proj.bias, shape:[16], dtype:1, has const value.)
E         model.layers.0.self_attn.v_proj.weight: StaticValueInfo(model.layers.0.self_attn.v_proj.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.0.self_attn.v_proj.bias: StaticValueInfo(model.layers.0.self_attn.v_proj.bias, shape:[16], dtype:1, has const value.)
E         model.layers.0.self_attn.rotary_emb.cos_cached: StaticValueInfo(model.layers.0.self_attn.rotary_emb.cos_cached, shape:[512, 4], dtype:1, has const value.)
E         model.layers.0.self_attn.rotary_emb.sin_cached: StaticValueInfo(model.layers.0.self_attn.rotary_emb.sin_cached, shape:[512, 4], dtype:1, has const value.)
E         model.layers.0.self_attn.dense.weight: StaticValueInfo(model.layers.0.self_attn.dense.weight, shape:[32, 32], dtype:1, has const value.)
E         model.layers.0.self_attn.dense.bias: StaticValueInfo(model.layers.0.self_attn.dense.bias, shape:[32], dtype:1, has const value.)
E         model.layers.0.mlp.fc1.weight: StaticValueInfo(model.layers.0.mlp.fc1.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.0.mlp.fc1.bias: StaticValueInfo(model.layers.0.mlp.fc1.bias, shape:[16], dtype:1, has const value.)
E         model.layers.0.mlp.fc2.weight: StaticValueInfo(model.layers.0.mlp.fc2.weight, shape:[32, 16], dtype:1, has const value.)
E         model.layers.0.mlp.fc2.bias: StaticValueInfo(model.layers.0.mlp.fc2.bias, shape:[32], dtype:1, has const value.)
E         model.layers.1.input_layernorm.weight: StaticValueInfo(model.layers.1.input_layernorm.weight, shape:[32], dtype:1, has const value.)
E         model.layers.1.input_layernorm.bias: StaticValueInfo(model.layers.1.input_layernorm.bias, shape:[32], dtype:1, has const value.)
E         model.layers.1.self_attn.q_proj.weight: StaticValueInfo(model.layers.1.self_attn.q_proj.weight, shape:[32, 32], dtype:1, has const value.)
E         model.layers.1.self_attn.q_proj.bias: StaticValueInfo(model.layers.1.self_attn.q_proj.bias, shape:[32], dtype:1, has const value.)
E         model.layers.1.self_attn.k_proj.weight: StaticValueInfo(model.layers.1.self_attn.k_proj.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.1.self_attn.k_proj.bias: StaticValueInfo(model.layers.1.self_attn.k_proj.bias, shape:[16], dtype:1, has const value.)
E         model.layers.1.self_attn.v_proj.weight: StaticValueInfo(model.layers.1.self_attn.v_proj.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.1.self_attn.v_proj.bias: StaticValueInfo(model.layers.1.self_attn.v_proj.bias, shape:[16], dtype:1, has const value.)
E         model.layers.1.self_attn.rotary_emb.cos_cached: StaticValueInfo(model.layers.1.self_attn.rotary_emb.cos_cached, shape:[512, 4], dtype:1, has const value.)
E         model.layers.1.self_attn.rotary_emb.sin_cached: StaticValueInfo(model.layers.1.self_attn.rotary_emb.sin_cached, shape:[512, 4], dtype:1, has const value.)
E         model.layers.1.self_attn.dense.weight: StaticValueInfo(model.layers.1.self_attn.dense.weight, shape:[32, 32], dtype:1, has const value.)
E         model.layers.1.self_attn.dense.bias: StaticValueInfo(model.layers.1.self_attn.dense.bias, shape:[32], dtype:1, has const value.)
E         model.layers.1.mlp.fc1.weight: StaticValueInfo(model.layers.1.mlp.fc1.weight, shape:[16, 32], dtype:1, has const value.)
E         model.layers.1.mlp.fc1.bias: StaticValueInfo(model.layers.1.mlp.fc1.bias, shape:[16], dtype:1, has const value.)
E         model.layers.1.mlp.fc2.weight: StaticValueInfo(model.layers.1.mlp.fc2.weight, shape:[32, 16], dtype:1, has const value.)
E         model.layers.1.mlp.fc2.bias: StaticValueInfo(model.layers.1.mlp.fc2.bias, shape:[32], dtype:1, has const value.)
E         model.final_layernorm.weight: StaticValueInfo(model.final_layernorm.weight, shape:[32], dtype:1, has const value.)
E         model.final_layernorm.bias: StaticValueInfo(model.final_layernorm.bias, shape:[32], dtype:1, has const value.)
E         unsqueeze: StaticValueInfo(unsqueeze, shape:[1, 7], dtype:7, no const value.)
E         _val_39: StaticValueInfo(_val_39, shape:[1], dtype:7, no const value.)
E         _val_43: StaticValueInfo(_val_43, shape:[1], dtype:7, no const value.)
E         _val_47: StaticValueInfo(_val_47, shape:[1], dtype:7, no const value.)
E         _val_51: StaticValueInfo(_val_51, shape:[1], dtype:7, no const value.)
E         slice_3: StaticValueInfo(slice_3, shape:[13, 7], dtype:1, no const value.)
E         aten_unsqueeze_84_dim_0: StaticValueInfo(aten_unsqueeze_84_dim_0, shape:[], dtype:7, no const value.)
E         unsqueeze_3: StaticValueInfo(unsqueeze_3, shape:[13, 1, 7], dtype:1, no const value.)
E         aten_unsqueeze_85_dim_0: StaticValueInfo(aten_unsqueeze_85_dim_0, shape:[], dtype:7, no const value.)
E         unsqueeze_4: StaticValueInfo(unsqueeze_4, shape:[13, 1, 1, 7], dtype:1, no const value.)
E         _val_58: StaticValueInfo(_val_58, shape:[1], dtype:7, no const value.)
E         _val_62: StaticValueInfo(_val_62, shape:[1], dtype:7, no const value.)
E         _val_66: StaticValueInfo(_val_66, shape:[1], dtype:7, no const value.)
E         _val_70: StaticValueInfo(_val_70, shape:[1], dtype:7, no const value.)
E         slice_4: StaticValueInfo(slice_4, shape:[13, 1, 1, 7], dtype:1, no const value.)
E         _val_72: StaticValueInfo(_val_72, shape:[4], dtype:7, no const value.)
E         aten_expand_104_size_1: StaticValueInfo(aten_expand_104_size_1, shape:[4], dtype:7, no const value.)
E         expand_1: StaticValueInfo(expand_1, shape:[13, 1, 7, 7], dtype:1, no const value.)
E         _val_74: StaticValueInfo(_val_74, shape:[], dtype:1, no const value.)
E         rsub: StaticValueInfo(rsub, shape:[13, 1, 7, 7], dtype:1, no const value.)
E         _to_copy: StaticValueInfo(_to_copy, shape:[13, 1, 7, 7], dtype:9, no const value.)
E         _val_77: StaticValueInfo(_val_77, shape:[], dtype:1, no const value.)
E         masked_fill_1: StaticValueInfo(masked_fill_1, shape:[13, 1, 7, 7], dtype:1, no const value.)
E         _to_copy_1: StaticValueInfo(_to_copy_1, shape:[13, 1, 7, 7], dtype:9, no const value.)
E         expand_2: StaticValueInfo(expand_2, shape:[13, 1, 7, 7], dtype:1, no const value.)
E         _val_119: StaticValueInfo(_val_119, shape:[], dtype:1, no const value.)
E         masked_fill_2: StaticValueInfo(masked_fill_2, shape:[13, 1, 7, 7], dtype:1, no const value.)
E         model_layers_0_1: StaticValueInfo(model_layers_0_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
E         model_layers_0_1_1: StaticValueInfo(model_layers_0_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
E         model_layers_0_1_2: StaticValueInfo(model_layers_0_1_2, shape:[13, 7, 32], dtype:1, no const value.)
E         model_layers_1_1: StaticValueInfo(model_layers_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
E         model_layers_1_1_1: StaticValueInfo(model_layers_1_1_1, shape:[13, 2, 7, 8], dtype:1, no const value.)
E         model_layers_1_1_2: StaticValueInfo(model_layers_1_1_2, shape:[13, 7, 32], dtype:1, no const value.)
E         model_final_layernorm_1: StaticValueInfo(model_final_layernorm_1, shape:[13, 7, 32], dtype:1, no const value.)
E       RefAttributes:
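
The traceback above shows the failing path: onnxscript.tools.transformers_models.export_to_onnx calls onnxscript.optimizer.optimize, whose function-inlining pass (inline_functions_with_unused_outputs) looks up every input of a nested function node in the enclosing scope and raises ValueError when a name such as model_embed_dropout_1 cannot be resolved. The following is a minimal sketch of that pass on a well-formed toy model, using only onnx.helper and the optimize entry point visible in the traceback; the local function and all names are illustrative, not the exported Phi model from this test.

# Toy model with one local function; every value the function node consumes is
# defined, so the optimizer's inlining pass succeeds instead of raising
# "Undefined variable" as in the failure above.
import onnx
import onnx.helper as oh
import onnxscript.optimizer

# Local function: y = Relu(x + x)
local_fn = oh.make_function(
    domain="local",
    fname="add_relu",
    inputs=["x"],
    outputs=["y"],
    nodes=[
        oh.make_node("Add", ["x", "x"], ["t"]),
        oh.make_node("Relu", ["t"], ["y"]),
    ],
    opset_imports=[oh.make_opsetid("", 18)],
)

graph = oh.make_graph(
    nodes=[oh.make_node("add_relu", ["inp"], ["out"], domain="local")],
    name="main",
    inputs=[oh.make_tensor_value_info("inp", onnx.TensorProto.FLOAT, [2, 3])],
    outputs=[oh.make_tensor_value_info("out", onnx.TensorProto.FLOAT, [2, 3])],
)
model = oh.make_model(
    graph,
    functions=[local_fn],
    opset_imports=[oh.make_opsetid("", 18), oh.make_opsetid("local", 1)],
)

# optimize() runs constant folding plus the same function-inlining passes that fail
# in the test; on this model every referenced value exists, so it completes.
optimized = onnxscript.optimizer.optimize(model)
print([n.op_type for n in optimized.graph.node])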

Check warning on line 0 in onnxscript.rewriter.llama_rule_sets_test.LlamaRuleSetsTest

@github-actions / Test Results

All 24 runs failed: test_llama_p0_rule_set_slice_split (onnxscript.rewriter.llama_rule_sets_test.LlamaRuleSetsTest)

artifacts/Test Results (py310-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-onnx-weekly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-onnx-weekly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py38-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py38-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py38-windows-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-macos-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-ubuntu-latest)/pytest.xml [took 0s]
artifacts/Test Results (py39-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Lists differ: ['Split'] != ['Slice', 'Slice']

First differing element 0:
'Split'
'Slice'

Second list contains 1 additional elements.
First extra element 1:
'Slice'

- ['Split']
+ ['Slice', 'Slice']
onnxscript/rewriter/llama_rule_sets_test.py:386: in test_llama_p0_rule_set_slice_split
    self.assertEqual(["Split"], [n.op_type for n in rewritten_model.graph.node])
E   AssertionError: Lists differ: ['Split'] != ['Slice', 'Slice']
E   
E   First differing element 0:
E   'Split'
E   'Slice'
E   
E   Second list contains 1 additional elements.
E   First extra element 1:
E   'Slice'
E   
E   - ['Split']
E   + ['Slice', 'Slice']
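
The assertion compares op types in the rewritten graph: the rule set under test is expected to fuse two Slice nodes that cut the same input into complementary pieces along one axis into a single Split node, but the rewrite did not fire and both Slice nodes survived. Below is a sketch, built with onnx.helper only, of the "before" shape such a rule matches and the "after" shape the test expects; the actual model construction and rule-set invocation live in onnxscript/rewriter/llama_rule_sets_test.py and may differ in detail.

# Illustrative graphs only: ['Slice', 'Slice'] is what the failing run produced,
# ['Split'] is what the test asserts after rewriting.
import onnx.helper as oh
from onnx import TensorProto

# Before the rewrite: two Slice nodes partition x[0:2] and x[2:4] along axis 0.
before = oh.make_graph(
    nodes=[
        oh.make_node("Slice", ["x", "zero", "half", "axis0"], ["part1"]),
        oh.make_node("Slice", ["x", "half", "last", "axis0"], ["part2"]),
    ],
    name="two_slices",
    inputs=[oh.make_tensor_value_info("x", TensorProto.FLOAT, [4, 8])],
    outputs=[
        oh.make_tensor_value_info("part1", TensorProto.FLOAT, [2, 8]),
        oh.make_tensor_value_info("part2", TensorProto.FLOAT, [2, 8]),
    ],
    initializer=[
        oh.make_tensor("zero", TensorProto.INT64, [1], [0]),
        oh.make_tensor("half", TensorProto.INT64, [1], [2]),
        oh.make_tensor("last", TensorProto.INT64, [1], [4]),
        oh.make_tensor("axis0", TensorProto.INT64, [1], [0]),
    ],
)

# Expected after the rewrite: a single Split producing both pieces at once.
after = oh.make_graph(
    nodes=[oh.make_node("Split", ["x"], ["part1", "part2"], axis=0, num_outputs=2)],
    name="one_split",
    inputs=[oh.make_tensor_value_info("x", TensorProto.FLOAT, [4, 8])],
    outputs=[
        oh.make_tensor_value_info("part1", TensorProto.FLOAT, [2, 8]),
        oh.make_tensor_value_info("part2", TensorProto.FLOAT, [2, 8]),
    ],
)

print([n.op_type for n in before.node])  # ['Slice', 'Slice'] -- observed
print([n.op_type for n in after.node])   # ['Split']          -- expected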

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0604_test_matmul_3d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_matmul_3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_matmul_3d.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_matmul_3d(a: FLOAT[2,3,4], b: FLOAT[2,4,3]) -> (FLOAT[2,3,3]):
    c = opset13.MatMul(a, b)
    return c
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_matmul_3d'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_matmul_3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_matmul_3d.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_matmul_3d(a: FLOAT[2,3,4], b: FLOAT[2,4,3]) -> (FLOAT[2,3,3]):
E       c = opset13.MatMul(a, b)
E       return c
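
This failure and the similar onnx_export_test failures below share one pattern: the harness writes the generated onnxscript code shown above to tests/onnx_backend_test_code/<name>.py, and importlib then fails to import it as a module on the Windows runner; the ModuleNotFoundError indicates the module could not be located, not that the emitted code failed to compile. As a sketch of ordinary onnxscript usage rather than the harness in onnxscript/backend/onnx_export_test.py, the emitted function can be exercised directly once it is importable: it supports eager evaluation with numpy inputs and conversion to a ModelProto.

# Sketch only: the function body is copied from the generated file above; eager
# evaluation and to_model_proto() are standard onnxscript features.
import numpy as np
from onnxscript import script
from onnxscript.onnx_opset import opset13
from onnxscript.onnx_types import FLOAT

@script()
def bck_test_matmul_3d(a: FLOAT[2, 3, 4], b: FLOAT[2, 4, 3]) -> FLOAT[2, 3, 3]:
    c = opset13.MatMul(a, b)
    return c

# Eager mode: calling the decorated function on numpy arrays evaluates it directly.
a = np.random.rand(2, 3, 4).astype(np.float32)
b = np.random.rand(2, 4, 3).astype(np.float32)
result = bck_test_matmul_3d(a, b)
print(np.asarray(result).shape)  # (2, 3, 3)

# The same object can also be converted back to an ONNX model.
model_proto = bck_test_matmul_3d.to_model_proto()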

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0632_test_max_int32 (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_max_int32' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_max_int32.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import INT32
from onnxscript.onnx_opset import opset13

@script()
def bck_test_max_int32(data_0: INT32[3], data_1: INT32[3]) -> (INT32[3]):
    result = opset13.Max(data_0, data_1)
    return result
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_max_int32'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_max_int32' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_max_int32.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import INT32
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_max_int32(data_0: INT32[3], data_1: INT32[3]) -> (INT32[3]):
E       result = opset13.Max(data_0, data_1)
E       return result

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0827_test_reduce_l2_keep_dims_example (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l2_keep_dims_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_l2_keep_dims_example.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_l2_keep_dims_example(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2,1]):
    reduced = opset18.ReduceL2(data, axes, keepdims=1)
    return reduced
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_l2_keep_dims_example'

The above exception was the direct cause of the following exception:
.nox\test_torch_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_l2_keep_dims_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_l2_keep_dims_example.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_l2_keep_dims_example(data: FLOAT[3,2,2], axes: INT64[1]) -> (FLOAT[3,2,1]):
E       reduced = opset18.ReduceL2(data, axes, keepdims=1)
E       return reduced

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0252_test_clip_splitbounds (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_clip_splitbounds' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_clip_splitbounds.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_clip_splitbounds(x: FLOAT[3], min: FLOAT, max: FLOAT) -> (FLOAT[3]):
    y = opset13.Clip(x, min, max)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_clip_splitbounds'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_clip_splitbounds' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_clip_splitbounds.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_clip_splitbounds(x: FLOAT[3], min: FLOAT, max: FLOAT) -> (FLOAT[3]):
E       y = opset13.Clip(x, min, max)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0282_test_convinteger_without_padding (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_convinteger_without_padding' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_convinteger_without_padding.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import INT32, UINT8
from onnxscript.onnx_opset import opset10

@script()
def bck_test_convinteger_without_padding(x: UINT8[1,1,3,3], w: UINT8[1,1,2,2], x_zero_point: UINT8) -> (INT32[1,1,2,2]):
    y = opset10.ConvInteger(x, w, x_zero_point)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_convinteger_without_padding'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_convinteger_without_padding' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_convinteger_without_padding.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import INT32, UINT8
E   from onnxscript.onnx_opset import opset10
E   
E   @script()
E   def bck_test_convinteger_without_padding(x: UINT8[1,1,3,3], w: UINT8[1,1,2,2], x_zero_point: UINT8) -> (INT32[1,1,2,2]):
E       y = opset10.ConvInteger(x, w, x_zero_point)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0029_test_and3d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_and3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_and3d.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import BOOL
from onnxscript.onnx_opset import opset7

@script()
def bck_test_and3d(x: BOOL[3,4,5], y: BOOL[3,4,5]) -> (BOOL[3,4,5]):
    r_and = opset7.And(x, y)
    return r_and
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_and3d'

The above exception was the direct cause of the following exception:
.nox\test_onnx_weekly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_and3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_and3d.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import BOOL
E   from onnxscript.onnx_opset import opset7
E   
E   @script()
E   def bck_test_and3d(x: BOOL[3,4,5], y: BOOL[3,4,5]) -> (BOOL[3,4,5]):
E       r_and = opset7.And(x, y)
E       return r_and

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0122_test_bitwise_not_3d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bitwise_not_3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bitwise_not_3d.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import UINT16
from onnxscript.onnx_opset import opset18

@script()
def bck_test_bitwise_not_3d(x: UINT16[3,4,5]) -> (UINT16[3,4,5]):
    bitwise_not = opset18.BitwiseNot(x)
    return bitwise_not
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_bitwise_not_3d'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bitwise_not_3d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bitwise_not_3d.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import UINT16
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_bitwise_not_3d(x: UINT16[3,4,5]) -> (UINT16[3,4,5]):
E       bitwise_not = opset18.BitwiseNot(x)
E       return bitwise_not

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0461_test_hardmax_one_hot (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_one_hot' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_hardmax_one_hot.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_hardmax_one_hot(x: FLOAT[1,4]) -> (FLOAT[1,4]):
    y = opset13.Hardmax(x)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_hardmax_one_hot'

The above exception was the direct cause of the following exception:
.nox\test_onnx_weekly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_hardmax_one_hot' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_hardmax_one_hot.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_hardmax_one_hot(x: FLOAT[1,4]) -> (FLOAT[1,4]):
E       y = opset13.Hardmax(x)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0126_test_bitwise_or_ui64_bcast_3v1d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bitwise_or_ui64_bcast_3v1d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bitwise_or_ui64_bcast_3v1d.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import UINT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_bitwise_or_ui64_bcast_3v1d(x: UINT64[3,4,5], y: UINT64[5]) -> (UINT64[3,4,5]):
    bitwiseor = opset18.BitwiseOr(x, y)
    return bitwiseor
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_bitwise_or_ui64_bcast_3v1d'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bitwise_or_ui64_bcast_3v1d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bitwise_or_ui64_bcast_3v1d.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import UINT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_bitwise_or_ui64_bcast_3v1d(x: UINT64[3,4,5], y: UINT64[5]) -> (UINT64[3,4,5]):
E       bitwiseor = opset18.BitwiseOr(x, y)
E       return bitwiseor

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0639_test_max_uint64 (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_max_uint64' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_max_uint64.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import UINT64
from onnxscript.onnx_opset import opset13

@script()
def bck_test_max_uint64(data_0: UINT64[3], data_1: UINT64[3]) -> (UINT64[3]):
    result = opset13.Max(data_0, data_1)
    return result
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_max_uint64'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_max_uint64' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_max_uint64.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import UINT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_max_uint64(data_0: UINT64[3], data_1: UINT64[3]) -> (UINT64[3]):
E       result = opset13.Max(data_0, data_1)
E       return result

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

@github-actions / Test Results

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_1002_test_scatter_with_axis (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_scatter_with_axis' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_scatter_with_axis.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset10

@script()
def bck_test_scatter_with_axis(data: FLOAT[1,5], indices: INT64[1,2], updates: FLOAT[1,2]) -> (FLOAT[1,5]):
    y = opset10.Scatter(data, indices, updates, axis=1)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_scatter_with_axis'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_scatter_with_axis' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_scatter_with_axis.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset10
E   
E   @script()
E   def bck_test_scatter_with_axis(data: FLOAT[1,5], indices: INT64[1,2], updates: FLOAT[1,2]) -> (FLOAT[1,5]):
E       y = opset10.Scatter(data, indices, updates, axis=1)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0849_test_reduce_log_sum_exp_do_not_keepdims_random (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_log_sum_exp_do_not_keepdims_random' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_log_sum_exp_do_not_keepdims_random.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import DOUBLE, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_log_sum_exp_do_not_keepdims_random(data: DOUBLE[3,2,2], axes: INT64[1]) -> (DOUBLE[3,2]):
    reduced = opset18.ReduceLogSumExp(data, axes, keepdims=0)
    return reduced
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_log_sum_exp_do_not_keepdims_random'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_log_sum_exp_do_not_keepdims_random' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_log_sum_exp_do_not_keepdims_random.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import DOUBLE, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_log_sum_exp_do_not_keepdims_random(data: DOUBLE[3,2,2], axes: INT64[1]) -> (DOUBLE[3,2]):
E       reduced = opset18.ReduceLogSumExp(data, axes, keepdims=0)
E       return reduced

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0983_test_rnn_seq_length (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_rnn_seq_length' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_rnn_seq_length.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset14

@script()
def bck_test_rnn_seq_length(X: FLOAT[2,3,3], W: FLOAT[1,5,3], R: FLOAT[1,5,5], B: FLOAT[1,10]) -> (FLOAT[1,3,5]):
    _0, Y_h = opset14.RNN(X, W, R, B, hidden_size=5)
    return Y_h
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_rnn_seq_length'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_rnn_seq_length' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_rnn_seq_length.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset14
E   
E   @script()
E   def bck_test_rnn_seq_length(X: FLOAT[2,3,3], W: FLOAT[1,5,3], R: FLOAT[1,5,5], B: FLOAT[1,10]) -> (FLOAT[1,3,5]):
E       _0, Y_h = opset14.RNN(X, W, R, B, hidden_size=5)
E       return Y_h

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0048_test_argmax_no_keepdims_example (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_argmax_no_keepdims_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_argmax_no_keepdims_example.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset13

@script()
def bck_test_argmax_no_keepdims_example(data: FLOAT[2,2]) -> (INT64[2]):
    result = opset13.ArgMax(data, axis=1, keepdims=0)
    return result
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_argmax_no_keepdims_example'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_argmax_no_keepdims_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_argmax_no_keepdims_example.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_argmax_no_keepdims_example(data: FLOAT[2,2]) -> (INT64[2]):
E       result = opset13.ArgMax(data, axis=1, keepdims=0)
E       return result

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0300_test_cos_example (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_cos_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_cos_example.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset7

@script()
def bck_test_cos_example(x: FLOAT[3]) -> (FLOAT[3]):
    y = opset7.Cos(x)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_cos_example'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_cos_example' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_cos_example.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset7
E   
E   @script()
E   def bck_test_cos_example(x: FLOAT[3]) -> (FLOAT[3]):
E       y = opset7.Cos(x)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0083_test_averagepool_2d_precomputed_pads_count_include_pad (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads_count_include_pad' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_averagepool_2d_precomputed_pads_count_include_pad.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset19

@script()
def bck_test_averagepool_2d_precomputed_pads_count_include_pad(x: FLOAT[1,1,5,5]) -> (FLOAT[1,1,5,5]):
    y = opset19.AveragePool(x, count_include_pad=1, kernel_shape=[5, 5], pads=[2, 2, 2, 2])
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads_count_include_pad'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_averagepool_2d_precomputed_pads_count_include_pad' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_averagepool_2d_precomputed_pads_count_include_pad.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_averagepool_2d_precomputed_pads_count_include_pad(x: FLOAT[1,1,5,5]) -> (FLOAT[1,1,5,5]):
E       y = opset19.AveragePool(x, count_include_pad=1, kernel_shape=[5, 5], pads=[2, 2, 2, 2])
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0267_test_concat_2d_axis_negative_1 (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_concat_2d_axis_negative_1' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_concat_2d_axis_negative_1.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_concat_2d_axis_negative_1(value0: FLOAT[2,2], value1: FLOAT[2,2]) -> (FLOAT[2,4]):
    output = opset13.Concat(value0, value1, axis=-1)
    return output
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_concat_2d_axis_negative_1'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_concat_2d_axis_negative_1' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_concat_2d_axis_negative_1.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_concat_2d_axis_negative_1(value0: FLOAT[2,2], value1: FLOAT[2,2]) -> (FLOAT[2,4]):
E       output = opset13.Concat(value0, value1, axis=-1)
E       return output

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0757_test_or_bcast4v4d (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_or_bcast4v4d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_or_bcast4v4d.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import BOOL
from onnxscript.onnx_opset import opset7

@script()
def bck_test_or_bcast4v4d(x: BOOL[1,4,1,6], y: BOOL[3,1,5,6]) -> (BOOL[3,4,5,6]):
    r_or = opset7.Or(x, y)
    return r_or
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_or_bcast4v4d'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_or_bcast4v4d' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_or_bcast4v4d.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import BOOL
E   from onnxscript.onnx_opset import opset7
E   
E   @script()
E   def bck_test_or_bcast4v4d(x: BOOL[1,4,1,6], y: BOOL[3,1,5,6]) -> (BOOL[3,4,5,6]):
E       r_or = opset7.Or(x, y)
E       return r_or

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0364_test_exp (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_exp' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_exp.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT
from onnxscript.onnx_opset import opset13

@script()
def bck_test_exp(x: FLOAT[3,4,5]) -> (FLOAT[3,4,5]):
    y = opset13.Exp(x)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_exp'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_exp' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_exp.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_exp(x: FLOAT[3,4,5]) -> (FLOAT[3,4,5]):
E       y = opset13.Exp(x)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0103_test_bernoulli (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py310-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bernoulli' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bernoulli.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import DOUBLE
from onnxscript.onnx_opset import opset15

@script()
def bck_test_bernoulli(x: DOUBLE[10]) -> (DOUBLE[10]):
    y = opset15.Bernoulli(x)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.10.11\x64\lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_bernoulli'

The above exception was the direct cause of the following exception:
.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_bernoulli' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_bernoulli.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import DOUBLE
E   from onnxscript.onnx_opset import opset15
E   
E   @script()
E   def bck_test_bernoulli(x: DOUBLE[10]) -> (DOUBLE[10]):
E       y = opset15.Bernoulli(x)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0388_test_gather_elements_0 (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-ort-nightly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gather_elements_0' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_gather_elements_0.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset13

@script()
def bck_test_gather_elements_0(data: FLOAT[2,2], indices: INT64[2,2]) -> (FLOAT[2,2]):
    y = opset13.GatherElements(data, indices, axis=1)
    return y
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_gather_elements_0'

The above exception was the direct cause of the following exception:
.nox\test_ort_nightly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_gather_elements_0' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_gather_elements_0.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_gather_elements_0(data: FLOAT[2,2], indices: INT64[2,2]) -> (FLOAT[2,2]):
E       y = opset13.GatherElements(data, indices, axis=1)
E       return y

Check warning on line 0 in onnxscript.backend.onnx_export_test.TestOnnxBackEnd

1 out of 8 runs failed: test_export2python_produces_correct_onnx_script_model_0918_test_reduce_sum_square_empty_set (onnxscript.backend.onnx_export_test.TestOnnxBackEnd)

artifacts/Test Results (py311-onnx-weekly-windows-latest)/pytest.xml [took 0s]
Raw output
AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_sum_square_empty_set' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_sum_square_empty_set.py'))
----
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset18

@script()
def bck_test_reduce_sum_square_empty_set(data: FLOAT[2,0,4], axes: INT64[1]) -> (FLOAT[2,1,4]):
    reduced = opset18.ReduceSumSquare(data, axes, keepdims=1)
    return reduced
onnxscript\backend\onnx_export_test.py:115: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_sum_square_empty_set'

The above exception was the direct cause of the following exception:
.nox\test_onnx_weekly\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:246: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:117: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_sum_square_empty_set' (file: WindowsPath('D:/a/onnxscript/onnxscript/tests/onnx_backend_test_code/test_reduce_sum_square_empty_set.py'))
E   ----
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_sum_square_empty_set(data: FLOAT[2,0,4], axes: INT64[1]) -> (FLOAT[2,1,4]):
E       reduced = opset18.ReduceSumSquare(data, axes, keepdims=1)
E       return reduced