
Be stricter about stack shape when converting primitives#21732

Open
SkySkimmer wants to merge 1 commit into rocq-prover:master from SkySkimmer:prim-assert-reduced

Conversation

@SkySkimmer (Contributor)

The stacks should be empty like with constructors.

@SkySkimmer added the "request: full CI" label Mar 9, 2026
@SkySkimmer marked this pull request as ready for review March 9, 2026 13:17
@SkySkimmer requested a review from a team as a code owner March 9, 2026 13:17
@coqbot-app bot removed the "request: full CI" label Mar 9, 2026
@SkySkimmer added the "needs: progress" label (work in progress: awaiting action from the author) Mar 9, 2026

-let assert_reduced_constructor s =
+let assert_reduced_constructor (s:stack) =
+  if not @@ CList.is_empty s then
Member

There could be updates and shifts in there?

Contributor Author

Why would there be some for int but not for constructor?

Member

I think it's a bug for constructors, we just don't test that enough.

Member

After offline discussion, this is fine because it's ensured by other parts of the code (whd_stack specifically).

Contributor Author

We ensure it so well that we drop parts of the stack (from ill-typed conversion problems expanding ints as lambdas):
https://rocq-prover.zulipchat.com/#narrow/channel/237656-Rocq-devs-.26-plugin-devs/topic/nonsense.20conversion.20problem.20returns.20nonsense.20result/with/578493810
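The invariant under discussion can be sketched in a minimal standalone way. This is not the actual Rocq kernel code: the `stack_item` type, the `Zapp`/`Zshift` constructors (which merely echo the names of the kernel's stack members), and the use of `failwith` in place of the kernel's anomaly machinery are all simplified stand-ins. The point it illustrates is the patch's stricter check: when conversion reaches a fully reduced primitive value, the surrounding stack must already be empty, exactly as is assumed for constructors; anything left over means `whd_stack` did not do its job.

```ocaml
(* Hypothetical simplified model of a conversion stack; the real
   kernel stack has more members and richer payloads. *)
type stack_item = Zapp of int list | Zshift of int
type stack = stack_item list

(* The stricter assertion: a reduced primitive must sit under an
   empty stack.  A non-empty stack is a kernel invariant violation. *)
let assert_reduced_primitive (s : stack) =
  if s <> [] then
    failwith "conversion was given unreduced term"

let () =
  (* Empty stack: the primitive is genuinely in head-normal position. *)
  assert_reduced_primitive [];
  (* A leftover shift would mean reduction stopped too early. *)
  (try
     assert_reduced_primitive [Zshift 1];
     print_endline "no anomaly"
   with Failure msg ->
     print_endline ("anomaly: " ^ msg))
```

Under this model, the second call trips the assertion, which is the behavior the PR wants instead of silently continuing with a malformed stack.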

@SkySkimmer (Contributor Author)

@coqbot ci minimize

coqbot-app bot commented Mar 10, 2026

I have initiated minimization at commit 2c72b4d for the suggested target ci-neural_net_interp as requested.

The stacks should be empty like with constructors.
@SkySkimmer force-pushed the prim-assert-reduced branch from 2c72b4d to 900b195 on March 10, 2026 15:11
@coqbot-app bot added the "needs: full CI" label (the latest GitLab pipeline that ran was a light CI; say "@coqbot run full ci" to get a full CI) Mar 10, 2026
coqbot-app bot commented Mar 10, 2026

Minimization interrupted by timeout, being automatically continued. Partially Minimized File /home/runner/work/run-coq-bug-minimizer/run-coq-bug-minimizer/builds/coq/coq-failing/_build_ci/neural_net_interp/theories/TransformerLens/HookedTransformer/Module.v in 5h 15m 10s (from ci-neural_net_interp) (full log on GitHub Actions - verbose log)
⭐ ⏱️ Partially Minimized Coq File (timeout)
(* -*- mode: coq; coq-prog-args: ("-emacs" "-q" "-w" "+implicit-core-hint-db,+implicits-in-term,+non-reversible-notation,+deprecated-intros-until-0,+deprecated-focus,+unused-intro-pattern,+variable-collision,+unexpected-implicit-declaration,+omega-is-deprecated,+deprecated-instantiate-syntax,+non-recursive,+undeclared-scope,+deprecated-hint-rewrite-without-locality,+deprecated-hint-without-locality,+deprecated-instance-without-locality,+deprecated-typeclasses-transparency-without-locality,-ltac2-missing-notation-var,unsupported-attributes" "-w" "-deprecated-native-compiler-option" "-native-compiler" "ondemand" "-coqlib" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq//" "-R" "/github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp/theories" "NeuralNetInterp" "-Q" "/github/workspace/cwd" "Top" "-Q" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq///user-contrib/Ltac2" "Ltac2" "-Q" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq///user-contrib/Stdlib" "Stdlib" "-top" "NeuralNetInterp.TransformerLens.HookedTransformer.Module") -*- *)
(* File reduced by coq-bug-minimizer from original input, then from 947 lines to 454 lines *)
(* coqc version 9.3+alpha compiled with OCaml 4.14.0
   coqtop version runner-khfdxfugu-project-4504-concurrent-4:/builds/coq/coq/_build/default,(HEAD detached at 1ebb4b29b7) (1ebb4b29b76e4ef06e0dd82b9405e21a23d27252)
   Expected coqc runtime on this file: 17.768 sec
   Expected coqc peak memory usage on this file: 753820.0 kb *)

Require Corelib.Classes.CMorphisms.
Require Corelib.Classes.Morphisms_Prop.
Require Corelib.Program.Tactics.
Require Corelib.Program.Basics.
Require Corelib.Relations.Relation_Definitions.
Require Corelib.BinNums.IntDef.
Require Corelib.BinNums.NatDef.
Require Corelib.Init.Byte.
Require Corelib.BinNums.PosDef.
Require Corelib.Init.Sumbool.
Require Corelib.Array.PrimArray.
Require Corelib.Numbers.Cyclic.Int63.Sint63Axioms.
Require Corelib.Lists.ListDef.
Require Corelib.Array.ArrayAxioms.
Require Corelib.derive.Derive.
Require NeuralNetInterp.Util.Tactics.IsFloat.
Require NeuralNetInterp.Util.Tactics.IsUint63.
Require NeuralNetInterp.TransformerLens.HookedTransformer.
Import Stdlib.Floats.Floats.
Import Stdlib.Numbers.Cyclic.Int63.Uint63.
Import Stdlib.ZArith.ZArith.
Import NeuralNetInterp.Util.PrimitiveProd.
Import NeuralNetInterp.Util.Tactics.IsUint63.
Import NeuralNetInterp.Util.Tactics.IsFloat.
Import NeuralNetInterp.Util.Tactics.ClearAll.
Export NeuralNetInterp.Util.Default.
Export NeuralNetInterp.Util.Pointed.
Import NeuralNetInterp.Util.Arith.Classes.
Import NeuralNetInterp.Util.Arith.Instances.
Import NeuralNetInterp.Torch.Tensor.
Import NeuralNetInterp.TransformerLens.HookedTransformer.
Import Instances.Truncating.
#[local] Open Scope primproj_scope.

Module Model (cfg : Config).
Definition all_tokens {use_checkpoint : with_default "use_checkpoint" bool true}
    : tensor [(cfg.d_vocab ^ cfg.n_ctx)%core : N; cfg.n_ctx] RawIndexType. exact (let all_toks := Tensor.arange (start:=0) (Uint63.of_Z cfg.d_vocab) in
       let all_tokens := Tensor.cartesian_exp all_toks cfg.n_ctx in
       PArray.maybe_checkpoint all_tokens). Defined.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.

  Module HookedTransformer.
    Section __.
      Context {r} {batch : Shape r} {pos}
        (s := (batch ::' pos)%shape)
        (resid_shape := (s ::' cfg.d_model)%shape)
        {A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
        {addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
        {maxA : has_max A}
        {sqrtA : has_sqrt A} {expA : has_exp A}
        {use_checkpoint : with_default "use_checkpoint" bool true}.
Let coerA' (x : float) : A. exact (coer x). Defined.
      #[local] Coercion coerA' : float >-> A.
Let coer_ln_tensor : cfg.ln_tensor float -> cfg.ln_tensor A. exact (match cfg.normalization_type as nt return Config.ln_tensor_gen _ nt float -> Config.ln_tensor_gen _ nt A with
             | Some LN
             | Datatypes.None
               => fun x => x
             end). Defined.
      Definition coer_blocks_params
        := List.map
             (fun '((W_Q, W_K, W_V, W_O,
                      b_Q, b_K, b_V, b_O,
                      ln1_w, ln1_b) : cfg.block_params_type float)
              => ((W_Q:tensor _ A), (W_K:tensor _ A), (W_V:tensor _ A), (W_O:tensor _ A),
                   (b_Q:tensor _ A), (b_K:tensor _ A), (b_V:tensor _ A), (b_O:tensor _ A),
                   coer_ln_tensor ln1_w, coer_ln_tensor ln1_b)).
Definition logits (tokens : tensor s IndexType) : tensor (s ::' cfg.d_vocab_out) A. exact (HookedTransformer.logits
             (A:=A) (n_ctx:=cfg.n_ctx) (normalization_type:=cfg.normalization_type) cfg.eps
             cfg.W_E cfg.W_pos
             (coer_blocks_params cfg.blocks_params)
             (coer_ln_tensor cfg.ln_final_w) (coer_ln_tensor cfg.ln_final_b)
             cfg.W_U cfg.b_U
             tokens). Defined.
    End __.
  End HookedTransformer.

  
  Notation logits_all_tokens
    := (@HookedTransformer.logits 1 [Uint63.of_Z (Z.of_N (@pow N N N N_has_pow cfg.d_vocab cfg.n_ctx))] (of_Z (Z.of_N cfg.n_ctx)) float (@coer_refl float) coer_Z_float float_has_add float_has_sub float_has_mul float_has_div float_has_max float_has_sqrt float_has_exp true (@all_tokens true)).

  Definition logits_all_tokens_concrete : PArray.concrete_tensor _ float
    := PArray.concretize logits_all_tokens.
    Ltac mkApp f x :=
      lazymatch f with
      | fun y => ?f => constr:(match x with y => f end)
      end.

    Ltac set_step _ :=
      match goal with
      | [ H := context G[let s : ?T := ?v in @?f s] |- _ ]
        => lazymatch goal with
           | [ s' := v |- _ ]
             => let fs := mkApp f s' in
                let G' := context G[fs] in
                change G' in (value of H)
           | _
             => let s' := fresh s in
                pose v as s';
                let fs := mkApp f s' in
                let G' := context G[fs] in
                change G' in (value of H)
           end;
           cbv beta iota in H
      | [ H := context G[let s : ?T := ?v in _] |- _ ]
        => assert_fails is_var v;
           lazymatch goal with
           | [ s' := v |- _ ]
             => change v with s' in (value of H)
           | _
             => let s' := fresh s in
                pose v as s';
                change v with s' in (value of H)
           end;
           cbv beta iota in H
      | [ |- context G[let s : ?T := ?v in @?f s] ]
        => lazymatch goal with
           | [ s' := v |- _ ]
             => let fs := mkApp f s' in
                let G' := context G[fs] in
                change G'
           | _
             => let s' := fresh s in
                pose v as s';
                let fs := mkApp f s' in
                let G' := context G[fs] in
                change G'
           end;
           cbv beta iota
      | [ |- context G[let s : ?T := ?v in _] ]
        => assert_fails is_var v;
           lazymatch goal with
           | [ s' := v |- _ ]
             => change v with s'
           | _
             => let s' := fresh s in
                pose v as s';
                change v with s'
           end;
           cbv beta iota
      end.
    Ltac subst_cleanup _ :=
      repeat match goal with
        | [ H := ?v |- _ ] => is_var v; subst H
        | [ H := ?x, H' := ?y |- _ ] => constr_eq x y; change H' with H in *; clear H'
        end.
    Ltac lift_lets _ := repeat set_step (); subst_cleanup ().

    Ltac set_checkpoint _ :=
      repeat match goal with
        | [ H := context G[?x] |- _ ]
          => lazymatch x with PArray.checkpoint _ => idtac | PArray.maybe_checkpoint _ => idtac end;
             lazymatch (eval cbv delta [H] in H) with
             | x => fail
             | _ => idtac
             end;
             let x' := fresh "t" in
             pose x as x';
             let G' := context G[x'] in
             change G' in (value of H)
        | [ |- context G[?x] ]
          => lazymatch x with PArray.checkpoint _ => idtac | PArray.maybe_checkpoint _ => idtac end;
             let x' := fresh "t" in
             pose x as x';
             let G' := context G[x'] in
             change G'
        end.

    Ltac subst_local_cleanup _ :=
      repeat match goal with
        | [ H := [ _ ] : ?T |- _ ]
          => lazymatch T with
             | Shape _ => idtac
             | forall b, Shape _ => idtac
             | Slice.Concrete.Slice _ => idtac
             | IndexType => idtac
             | Slice.Slice _ => idtac
             | PolymorphicOption.option IndexType => idtac
             | PolymorphicOption.option int => idtac
             end;
             subst H
        | [ H := ?v |- _ ]
          => lazymatch v with
             | fun f x => f x => idtac
             | fun x => x => idtac
             | _ => first [ is_uint63 v | is_float v ]
             end;
             subst H
        | [ H := [ fun x => coer x ] : float -> float |- _ ] => cbv in H; subst H
        | [ H := [ coer point ] : float |- _ ] => cbv in H; subst H
        | [ H := [ coer_Z_float _ ] : float |- _ ] => cbv in H; subst H
        | [ H := [ _ ] : ?T |- _ ]
          => lazymatch T with
             | has_one int => idtac
             end;
             cbv in H; subst H
        end;
      cbv beta iota in *.

    Ltac reduce _ :=
      cbv beta iota delta [
          repeat repeat' reduce_axis_m1 map map' reduce_axis_m1' reshape_app_combine broadcast broadcast' reshape_app_combine' RawIndex.uncurry_radd RawIndex.split_radd reshape_snoc_split reshape_app_split reshape_app_split' RawIndex.curry_radd RawIndex.combine_radd RawIndex.hd RawIndex.tl
            adjust_index_for
            Nat.radd
            Classes.sqrt Classes.add Classes.sub Classes.opp Classes.mul Classes.div Classes.sqr Classes.one Classes.zero Classes.exp Classes.eqb Classes.neqb Classes.ltb Classes.leb Classes.matmul
            bool_has_one bool_has_zero bool_has_eqb
            int_has_one Uint63.int_has_ltb PrimInt63.ltb
            Sint63.max Sint63.int_has_leb lesb
            has_default_max_leb
            lift_coer_has_zero lift_coer_has_one
            Z_has_zero Z_has_one
            float_has_zero float_has_one
            coer_refl coer_tensor
            int_has_add
            Tensor.get Tensor.raw_get Slicing.SliceIndex.SliceIndexType.slice Slice.invert_index Slice.concretize PolymorphicOption.Option.sequence_return Slice.step Slice.start Slice.stop Slice.Concrete.length Slicing.SliceIndex.slice Slicing.FancyIndex.slice Slicing.FancyIndex.slice_ Slicing.FancyIndex.broadcast Slicing.FancyIndex.FancyIndexType.broadcast Slice.Concrete.normalize Slice.Concrete.step Slice.Concrete.stop Slice.Concrete.start Slicing.broadcast_one_index'' Slicing.broadcast_one_index'
            Slice.Concrete.step Slice.Concrete.stop Slice.Concrete.base_len
            Slicing.inject_int
            RawIndex.snoc RawIndex.nil
            map_dep map2 map2' map3
            ones tril to_bool
            Shape.tl Shape.hd Shape.snoc Shape.nil
            item int_has_eqb raw_get Shape.broadcast2 Shape.map2
        ] in *;
      cbn beta iota delta [fst snd Primitive.fst Primitive.snd] in *.
    Ltac do_red _ :=
      reduce ();
      lift_lets (); set_checkpoint (); subst_local_cleanup ().
    

    Ltac red_normalization_type_layers _ :=
      cbv beta iota delta [logits_all_tokens_concrete logits_all_tokens HookedTransformer.coer_blocks_params] in *;
      lift_lets (); set_checkpoint ().

    Ltac red_early_layers _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.logits HookedTransformer.Unembed.forward HookedTransformer.HookedTransformer.resid_postembed HookedTransformer.HookedTransformer.pos_embed HookedTransformer.HookedTransformer.embed HookedTransformer.Embed.forward HookedTransformer.PosEmbed.forward HookedTransformer.resid_postembed all_tokens] in *;
      lift_lets (); set_checkpoint ().
    Ltac red_blocks_layers_1 _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.blocks_cps HookedTransformer.HookedTransformer.blocks] in *;
      lift_lets (); set_checkpoint ().

    Ltac red_blocks_layers_2 _ :=
      cbv beta iota delta [TransformerBlock.ln1 LayerNorm.forward TransformerBlock.query_input TransformerBlock.key_input TransformerBlock.value_input TransformerBlock.add_head_dimension LayerNorm.scale LayerNorm.rescale LayerNorm.linpart LayerNorm.postrescale] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_3 _ :=
      cbv beta iota delta [Attention.attn_out Attention.z Attention.v Attention.pattern] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_4 _ :=
      cbv beta iota delta [HookedTransformer.Attention.masked_attn_scores HookedTransformer.Attention.attn_scores Attention.einsum_input Attention.q Attention.k] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_5 _ :=
      cbv [Attention.apply_causal_mask] in *;
      repeat (cbv beta iota zeta in *; do_red ()).
    Ltac red_blocks_layers_6 _ :=
      cbv beta iota delta [softmax_dim_m1] in *;
      lift_lets (); do_red ().
    Ltac red_ops _ :=
      cbv beta iota delta [Bool.where_ where_
                             tensor_add tensor_sub tensor_mul tensor_div_by tensor_sqrt tensor_matmul diagonal mm
                             float_has_add float_has_sub float_has_mul float_has_div float_has_exp float_has_sqrt
                             coer coer_Z_float] in *;
      do_red ().
    Ltac red_sum _ :=
      cbv [Wf_Uint63.Reduction.sum Wf_Uint63.map_reduce Wf_Uint63.for_loop_lt Classes.eqb PrimInt63.eqb Monad.bind Wf_Uint63.get Wf_Uint63.LoopBody_Monad Wf_Uint63.run_body Wf_Uint63.bind Wf_Uint63.set Wf_Uint63.update Wf_Uint63.Reduction.mean Classes.int_div Uint63.int_has_int_div Classes.div coer coer_Z_float Classes.sub int_has_sub] in *.

    Ltac red_late_layers_1 _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.ln_final HookedTransformer.HookedTransformer.unembed LayerNorm.forward HookedTransformer.Unembed.forward Unembed.forward] in *;
      lift_lets (); set_checkpoint ().
    Ltac red_late_layers_2 _ :=
      cbv beta iota delta [LayerNorm.linpart LayerNorm.scale LayerNorm.rescale LayerNorm.postrescale] in *;
      lift_lets (); set_checkpoint (); do_red ().

  Derive logits_all_tokens_concrete_opt
    SuchThat (logits_all_tokens_concrete_opt = logits_all_tokens_concrete)
    As logits_all_tokens_concrete_opt_eq.
  Proof.
    Unshelve.
    2:{
 pose proof cfg.blocks_params as blocks_params.
        pose proof cfg.ln_final_w as ln_final_w.
        pose proof cfg.ln_final_b as ln_final_b.
        destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
        all: shelve.
}
    red_normalization_type_layers ().
    subst blocks_params ln_final_b ln_final_w.
    set (blocks_params := cfg.blocks_params) in *.
    set (ln_final_w := cfg.ln_final_w) in *.
    set (ln_final_b := cfg.ln_final_b) in *.
    clearbody blocks_params ln_final_w ln_final_b.
    assert_succeeds destruct cfg.normalization_type.
    cbv beta zeta in *.
    red_early_layers ().
    red_blocks_layers_1 ().
    subst_local_cleanup ().
    rewrite List.firstn_all, List.map_map.
    lazymatch goal with
    | [ |- _ = ?concretize (List.fold_right ?k ?f ?ls ?resid) ]
      => let f' := open_constr:(_) in
         let ls' := open_constr:(_) in
         let Hf := fresh in
         let Hls := fresh in
         let f'' := fresh in
         pose f' as f'';
         assert (Hf : forall x, f'' x = f x /\ f'' = f);
         [ subst f'' | replace f with f''; [ subst f'' | clearbody f''; clear -Hf; abstract apply Hf, broadcast', point ]  ];
         [ | replace ls with ls'
         | .. ]
    end.
    3:{
 repeat match goal with H : _ |- _ => clear H end.
        instantiate (1:=ltac:(destruct cfg.normalization_type as [nt|]; [ destruct nt | ])).
        destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
        all: cbv beta iota zeta; subst_local_cleanup ().
        all: cbv beta iota delta [TransformerBlock.attn_only_out]; lift_lets (); set_checkpoint ().
        all: match goal with
             | [ |- _ = List.map ?f _ ]
               => let f' := open_constr:(_) in
                  let f'' := fresh in
                  pose f' as f'';
                  let H := fresh in
                  assert (H : forall x y, f'' x y = f x y /\ f'' = f);
                  [ subst f''; intros ??
                  | replace f with f''; [ subst f''; shelve | clearbody f''; clear -H; shelve ] ]
             end.
        all: lift_lets (); set_checkpoint ().
        all: red_blocks_layers_2 ().
        all: red_blocks_layers_3 ().
        all: red_blocks_layers_4 ().
        all: red_blocks_layers_5 ().
        all: red_blocks_layers_6 ().
        all: red_ops ().
        all: red_sum ().
        all: clear_all.
        all: repeat lazymatch goal with
               | [ H := ?x |- _ ]
                 => revert H;
                    lazymatch goal with
                    | [ |- let H := ?x in ?lhs = ?rhs /\ ?lhs' = ?rhs' ]
                      => change (lhs = (let H := x in rhs) /\ lhs' = (let H := x in rhs'))
                    end
               end.
        all: lazymatch goal with |- ?e ?x ?y = _ /\ _ => revert x y end.
        Unshelve.
        all: shelve_unifiable.
        all: lazymatch goal with
             | [ |- forall x y, ?lhs x y = @?rhs x y /\ ?lhs = _ ]
               => change (forall x y, lhs x y = rhs x y /\ lhs = rhs); instantiate (1:=rhs); split; abstract reflexivity
             | _ => idtac
             end.
        all: cbv beta iota.
        all: repeat match goal with H : _ |- ?ev = _ => is_evar ev; clear H end.
        all: lazymatch goal with
             | [ |- ?ev = List.map _ _ ]
               => is_evar ev;
                  let rhs := lazymatch goal with |- _ = ?v => v end in
                  instantiate (1:=rhs); abstract reflexivity
             | [ H : forall x y, _ = _ /\ _ = _ |- _ = _ ] => abstract (apply H; repeat split; apply broadcast'; exact point)
             | _ => idtac
             end.
        shelve.
}
    {
 repeat match goal with H : _ |- _ => clear H end.
      instantiate (1:=ltac:(destruct cfg.normalization_type as [nt|]; [ destruct nt | ])).
      destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
      all: intros.
      all: lift_lets (); subst_local_cleanup ().
      all: repeat match goal with H := Some _ |- _ => subst H end.
      all: repeat match goal with H := None |- _ => subst H end.
      all: cbv beta iota zeta.
      all: do_red ().
      all: red_late_layers_1 ().
      all: red_late_layers_2 ().
      all: red_ops ().
      all: red_sum ().
      all: do_red ().
      all: lazymatch goal with
           | [ |- context[Definitions.PrimFloat.of_Z ?z] ]
             => pose (Definitions.PrimFloat.of_Z z) as z';
                move z' at top;
                repeat match goal with
                  | [ H := context G[Definitions.PrimFloat.of_Z z] |- _ ]
                    => let G' := context G[z'] in
                       change G' in (value of H)
                  | [ |- context G[Definitions.PrimFloat.of_Z z] ]
                    => let G' := context G[z'] in
                       change G'
                  end
           | _ => idtac
           end.
      all: clear_all.
      all: repeat lazymatch goal with
             | [ H := ?x |- _ ]
               => revert H;
                  lazymatch goal with
                  | [ |- let H := ?x in ?lhs = ?rhs /\ ?lhs' = ?rhs' ]
                    => change (lhs = (let H := x in rhs) /\ lhs' = (let H := x in rhs'))
                  end
             end.
      all: lazymatch goal with |- ?e ?x = _ /\ _ => revert x end.
      Unshelve.
      all: shelve_unifiable.
      all: lazymatch goal with
           | [ |- forall x, ?lhs x = @?rhs x /\ ?lhs = _ ]
             => change (forall x, lhs x = rhs x /\ lhs = rhs); instantiate (1:=rhs); split; abstract reflexivity
           | _ => idtac
           end.
      all: cbv beta iota.
      all: shelve.
}
    all: cbv beta.
    all: do_red ().
    all: clear_all.
    cbv beta iota zeta in embed, pos_embed.
    destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
    all: repeat match goal with H := Some _ |- _ => subst H end.
    all: repeat match goal with H := None |- _ => subst H end.
    all: cbv beta iota in *.
    all: red_ops (); do_red ().
    all: try subst logits_all_tokens_concrete_opt.
    all: repeat lazymatch goal with
           | [ H := ?x |- _ ]
             => revert H;
                lazymatch goal with
                | [ |- let H := ?x in ?lhs = ?rhs ]
                  => change (lhs = (let H := x in rhs))
                end
           end.
    all: lazymatch goal with
         | [ |- ?ev = ?v ]
           => tryif is_evar ev
             then instantiate (1:=v); abstract reflexivity
             else idtac
         end.
  Qed.
🛠️ Intermediate Coq File (useful for debugging if minimization did not go as far as you wanted)
🛠️ 📜 Intermediate Coq File log (useful for debugging if minimization did not go as far as you wanted)
📜 Build Log (contains the Coq error message) (truncated to last 8.0KiB; full 5.5MiB file on GitHub Actions Artifacts under build.log)
he recommended minimum of 65520 kbytes
ROCQ compile theories/TransformerLens/HookedTransformer/Module.v
MINIMIZER_DEBUG_EXTRA: coqc: /github/workspace/builds/coq/coq-failing/_install_ci/bin///rocq
MINIMIZER_DEBUG_EXTRA: original invocation: '' 
MINIMIZER_DEBUG_EXTRA: new invocation: /github/workspace/builds/coq/coq-failing/_install_ci/bin/rocq.orig compile -q -w +implicit-core-hint-db\,+implicits-in-term\,+non-reversible-notation\,+deprecated-intros-until-0\,+deprecated-focus\,+unused-intro-pattern\,+variable-collision\,+unexpected-implicit-declaration\,+omega-is-deprecated\,+deprecated-instantiate-syntax\,+non-recursive\,+undeclared-scope\,+deprecated-hint-rewrite-without-locality\,+deprecated-hint-without-locality\,+deprecated-instance-without-locality\,+deprecated-typeclasses-transparency-without-locality\,-ltac2-missing-notation-var\,unsupported-attributes -w -deprecated-native-compiler-option -native-compiler ondemand -R /github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp/theories NeuralNetInterp theories/TransformerLens/HookedTransformer/Module.v 
MINIMIZER_DEBUG_EXTRA: coqpath: 
MINIMIZER_DEBUG_EXTRA: ocamlpath: /github/workspace/builds/coq/coq-failing/_install_ci/lib:
MINIMIZER_DEBUG_EXTRA: pwd: PWD=/github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp
MINIMIZER_DEBUG_EXTRA: exec: /github/workspace/builds/coq/coq-failing/_install_ci/bin/rocq.orig compile -q -w +implicit-core-hint-db\,+implicits-in-term\,+non-reversible-notation\,+deprecated-intros-until-0\,+deprecated-focus\,+unused-intro-pattern\,+variable-collision\,+unexpected-implicit-declaration\,+omega-is-deprecated\,+deprecated-instantiate-syntax\,+non-recursive\,+undeclared-scope\,+deprecated-hint-rewrite-without-locality\,+deprecated-hint-without-locality\,+deprecated-instance-without-locality\,+deprecated-typeclasses-transparency-without-locality\,-ltac2-missing-notation-var\,unsupported-attributes -w -deprecated-native-compiler-option -native-compiler ondemand -R /github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp/theories NeuralNetInterp theories/TransformerLens/HookedTransformer/Module.v 
MINIMIZER_DEBUG_EXTRA: coqlib: Warning: Deprecated environment variable COQLIB, use ROCQLIB instead.
/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq//
MINIMIZER_DEBUG: info: /tmp/tmp-coqbot-minimizer.gVThCpLDJW
MINIMIZER_DEBUG: files:  theories/TransformerLens/HookedTransformer/Module.v /github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp/theories/TransformerLens/HookedTransformer/Module.v
Warning, feedback message received but no listener to handle it!
Warning: Deprecated environment variable COQLIB, use ROCQLIB instead.
[deprecated-coq-env-var,deprecated-since-9.0,deprecated,default]Warning, feedback message received but no listener to handle it!
Warning: Deprecated environment variable COQLIB, use ROCQLIB instead.
[deprecated-coq-env-var,deprecated-since-9.0,deprecated,default]
Warning: Deprecated environment variable COQCORELIB,
use ROCQRUNTIMELIB instead.
[deprecated-coq-env-var,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 1, characters 5-8:
Warning: "From Coq" has been replaced by "From Stdlib".
[deprecated-from-Coq,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 1, characters 0-24:
Warning: Using Vector.t is known to be technically difficult, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Vector.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 2, characters 5-8:
Warning: "From Coq" has been replaced by "From Stdlib".
[deprecated-from-Coq,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 3, characters 5-19:
Warning: "From Coq" has been replaced by "From Stdlib".
[deprecated-from-Coq,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 4, characters 5-8:
Warning: "From Coq" has been replaced by "From Stdlib".
[deprecated-from-Coq,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 8, characters 0-65:
Warning:
New coercion path [N.to_nat; N.of_nat] : N >-> N is not definitionally an identity function.
New coercion path [N.of_nat; N.to_nat] : nat >-> nat is not definitionally an identity function.
[ambiguous-paths,coercions,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 9, characters 0-49:
Warning:
New coercion path [coer_tensor] : tensor >-> tensor is not definitionally an identity function.
[ambiguous-paths,coercions,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 11, characters 0-28:
Warning:
New coercion path [Definitions.PrimFloat.of_Z; Definitions.PrimFloat.to_Q] : Z >-> QArith_base.Q is ambiguous with existing 
[QArith_base.inject_Z] : Z >-> QArith_base.Q.
[ambiguous-paths,coercions,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 17, characters 0-8:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 319, characters 2-10:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 483, characters 2-10:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 607, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 607, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 652, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 652, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 773, characters 2-156:
Warning: Use of "SuchThat" and "As" in "Derive" is deprecated; replace them
respectively by "in" and "as".
[deprecated-derive-suchthat,deprecated-since-9.0,deprecated,default]
File "./theories/TransformerLens/HookedTransformer/Module.v", line 930, characters 2-6:
Error: Anomaly "conversion was given unreduced term (FConstruct)."
Please report at http://rocq-prover.org/bugs/.

Command exited with non-zero status 129
theories/TransformerLens/HookedTransformer/Module.vo (real: 17.04, user: 16.87, sys: 0.16, mem: 823312 ko)
make[1]: *** [Makefile.coq:815: theories/TransformerLens/HookedTransformer/Module.vo] Error 129
make[1]: *** [theories/TransformerLens/HookedTransformer/Module.vo] Deleting file 'theories/TransformerLens/HookedTransformer/Module.glob'
make: *** [Makefile:42: invoke-coqmakefile] Error 2
+ code=2
+ printf '\n%s exit code: %s\n' neural_net_interp 2
+ '[' neural_net_interp '!=' stdlib_test ']'
+ echo 'Aggregating timing log...'
Aggregating timing log...
+ echo

+ tools/make-one-time-file.py --real _build_ci/neural_net_interp.log
    Time |  Peak Mem | File Name                                  
------------------------------------------------------------------
0m17.04s | 823312 ko | Total Time / Peak Mem                      
------------------------------------------------------------------
0m17.04s | 823312 ko | TransformerLens/HookedTransformer/Module.vo
+ '[' '' ']'
+ exit 2
/github/workspace/builds/coq /github/workspace
::endgroup::
📜 🔎 Minimization Log (truncated to last 8.0KiB; full 1.5MiB file on GitHub Actions Artifacts under bug.log)
tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 771, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 771, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 815, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 815, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 891, characters 2-156:
Warning: Use of "SuchThat" and "As" in "Derive" is deprecated; replace them
respectively by "in" and "as".
[deprecated-derive-suchthat,deprecated-since-9.0,deprecated,default]
File "/tmp/tmptugqp0cv/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 894, characters 0-9:
Error: Cannot admit: the statement has unresolved existential variables.


Intermediate code not saved.
Failed to do everything at once; trying one at a time.
Admitting definitions unsuccessful.
No successful changes.

I will now attempt to admit lemmas with admit. Defined with Proof using
Admitting lemmas successful.
Failed to do everything at once; trying one at a time.
Admitting lemmas unsuccessful.
No successful changes.

I will now attempt to admit definitions with admit. Defined with Proof using

Non-fatal error: Failed to admit definitions and preserve the error.
The new error was:
Deprecated environment variable COQCORELIB, use ROCQRUNTIMELIB instead.
Warning: Deprecated environment variable COQCORELIB,
use ROCQRUNTIMELIB instead.
[deprecated-coq-env-var,deprecated-since-9.0,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 232, characters 0-27:
Warning: Alternatives to Fin.t are available, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Fin.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 235, characters 0-33:
Warning: Using Vector.t is known to be technically difficult, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Vector.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 245, characters 0-34:
Warning: Using Vector.t is known to be technically difficult, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Vector.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 247, characters 0-32:
Warning: Using Vector.t is known to be technically difficult, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Vector.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 248, characters 0-30:
Warning: Using Vector.t is known to be technically difficult, see
<https://github.com/coq/stdlib/blob/master/theories/Vectors/Vector.v>.
[warn-library-file-stdlib-vector,stdlib-vector,warn-library-file,user-warn,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 304, characters 0-34:
Warning: Library File Stdlib.ZArith.ZArith_base is deprecated
since Stdlib 9.0. use ZArith instead
[deprecated-library-file-since-Stdlib-9.0,deprecated-since-Stdlib-9.0,deprecated-library-file,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 448, characters 7-20:
Warning: Coq.Init.Ltac has been replaced by Corelib.Init.Ltac.
[deprecated-dirpath-Coq,deprecated-since-9.0,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 457, characters 0-79:
Warning:
New coercion path [N.to_nat; N.of_nat] : N >-> N is not definitionally an identity function.
New coercion path [N.of_nat; N.to_nat] : nat >-> nat is not definitionally an identity function.
[ambiguous-paths,coercions,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 458, characters 0-36:
Warning:
New coercion path [coer_tensor] : tensor >-> tensor is not definitionally an identity function.
[ambiguous-paths,coercions,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 460, characters 0-28:
Warning:
New coercion path [Definitions.PrimFloat.of_Z; Definitions.PrimFloat.to_Q] : Z >-> QArith_base.Q is ambiguous with existing 
[QArith_base.inject_Z] : Z >-> QArith_base.Q.
[ambiguous-paths,coercions,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 466, characters 0-8:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 644, characters 2-10:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 647, characters 2-10:
Warning: Use of "Notation" keyword for abbreviations is deprecated, use
"Abbreviation" instead.
[notation-for-abbreviation,deprecated-since-9.2,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 771, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 771, characters 6-61:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 815, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 815, characters 6-60:
Warning: Set ... Append is not supported.
[set-append-deprecated,deprecated-since-9.1,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 891, characters 2-156:
Warning: Use of "SuchThat" and "As" in "Derive" is deprecated; replace them
respectively by "in" and "as".
[deprecated-derive-suchthat,deprecated-since-9.0,deprecated,default]
File "/tmp/tmp663wmuld/NeuralNetInterp/TransformerLens/HookedTransformer/Module.v", line 894, characters 0-5:
Error: Attempt to save an incomplete proof.


Intermediate code not saved.
Failed to do everything at once; trying one at a time.
Admitting definitions unsuccessful.
No successful changes.

I will now attempt to add Proof using lines
Adding Proof using lines successful.
Failed to do everything at once; trying one at a time.
Adding Proof using lines unsuccessful.
No successful changes.

I will now attempt to export modules
Module exportation successful

I will now attempt to split imports and exports
Import/Export splitting successful

I will now attempt to split := definitions
One-line definition splitting successful

I will now attempt to lift Requires to the top of the file while inserting option settings

I will now attempt to lift Requires to the top of the file while inserting option settings

I will now attempt to remove all lines, one at a time

If you have any comments on your experience of the minimizer, please share them in a reply (possibly tagging @JasonGross).
If you believe there's a bug in the bug minimizer, please report it on the bug minimizer issue tracker.

@coqbot-app
Contributor

coqbot-app bot commented Mar 10, 2026

Minimized File in 4m 4s (from ci-neural_net_interp) (full log on GitHub Actions)

We are collecting data on the user experience of the Coq Bug Minimizer.
If you haven't already filled the survey for this PR, please fill out our short survey!

🌟 Minimized Coq File (consider adding this file to the test-suite)
(* -*- mode: coq; coq-prog-args: ("-emacs" "-q" "-w" "+implicit-core-hint-db,+implicits-in-term,+non-reversible-notation,+deprecated-intros-until-0,+deprecated-focus,+unused-intro-pattern,+variable-collision,+unexpected-implicit-declaration,+omega-is-deprecated,+deprecated-instantiate-syntax,+non-recursive,+undeclared-scope,+deprecated-hint-rewrite-without-locality,+deprecated-hint-without-locality,+deprecated-instance-without-locality,+deprecated-typeclasses-transparency-without-locality,-ltac2-missing-notation-var,unsupported-attributes" "-w" "-deprecated-native-compiler-option" "-native-compiler" "ondemand" "-coqlib" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq//" "-R" "/github/workspace/builds/coq/coq-failing/_build_ci/neural_net_interp/theories" "NeuralNetInterp" "-Q" "/github/workspace/cwd" "Top" "-Q" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq///user-contrib/Ltac2" "Ltac2" "-Q" "/github/workspace/builds/coq/coq-failing/_install_ci/lib/coq///user-contrib/Stdlib" "Stdlib" "-top" "NeuralNetInterp.TransformerLens.HookedTransformer.Module") -*- *)
(* File reduced by coq-bug-minimizer from original input, then from 947 lines to 454 lines *)
(* coqc version 9.3+alpha compiled with OCaml 4.14.0
   coqtop version runner-khfdxfugu-project-4504-concurrent-4:/builds/coq/coq/_build/default,(HEAD detached at 1ebb4b29b7) (1ebb4b29b76e4ef06e0dd82b9405e21a23d27252)
   Expected coqc runtime on this file: 17.768 sec
   Expected coqc peak memory usage on this file: 753820.0 kb *)









Require Corelib.Classes.CMorphisms.
Require Corelib.Classes.Morphisms_Prop.
Require Corelib.Program.Tactics.
Require Corelib.Program.Basics.
Require Corelib.Relations.Relation_Definitions.
Require Corelib.BinNums.IntDef.
Require Corelib.BinNums.NatDef.
Require Corelib.Init.Byte.
Require Corelib.BinNums.PosDef.
Require Corelib.Init.Sumbool.
Require Corelib.Array.PrimArray.
Require Corelib.Numbers.Cyclic.Int63.Sint63Axioms.
Require Corelib.Lists.ListDef.
Require Corelib.Array.ArrayAxioms.
Require Corelib.derive.Derive.
Require NeuralNetInterp.Util.Tactics.IsFloat.
Require NeuralNetInterp.Util.Tactics.IsUint63.
Require NeuralNetInterp.TransformerLens.HookedTransformer.
Import Stdlib.Floats.Floats.
Import Stdlib.Numbers.Cyclic.Int63.Uint63.
Import Stdlib.ZArith.ZArith.
Import NeuralNetInterp.Util.PrimitiveProd.
Import NeuralNetInterp.Util.Tactics.IsUint63.
Import NeuralNetInterp.Util.Tactics.IsFloat.
Import NeuralNetInterp.Util.Tactics.ClearAll.
Export NeuralNetInterp.Util.Default.
Export NeuralNetInterp.Util.Pointed.
Import NeuralNetInterp.Util.Arith.Classes.
Import NeuralNetInterp.Util.Arith.Instances.
Import NeuralNetInterp.Torch.Tensor.
Import NeuralNetInterp.TransformerLens.HookedTransformer.
Import Instances.Truncating.
#[local] Open Scope primproj_scope.

Module Model (cfg : Config).
Definition all_tokens {use_checkpoint : with_default "use_checkpoint" bool true}
    : tensor [(cfg.d_vocab ^ cfg.n_ctx)%core : N; cfg.n_ctx] RawIndexType. exact (let all_toks := Tensor.arange (start:=0) (Uint63.of_Z cfg.d_vocab) in
       let all_tokens := Tensor.cartesian_exp all_toks cfg.n_ctx in
       PArray.maybe_checkpoint all_tokens). Defined.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.
    Section __.
    End __.

  Module HookedTransformer.
    Section __.
      Context {r} {batch : Shape r} {pos}
        (s := (batch ::' pos)%shape)
        (resid_shape := (s ::' cfg.d_model)%shape)
        {A} {coer_float : has_coer float A} {coerZ : has_coer Z A}
        {addA : has_add A} {subA : has_sub A} {mulA : has_mul A} {divA : has_div A}
        {maxA : has_max A}
        {sqrtA : has_sqrt A} {expA : has_exp A}
        {use_checkpoint : with_default "use_checkpoint" bool true}.
Let coerA' (x : float) : A. exact (coer x). Defined.
      #[local] Coercion coerA' : float >-> A.
Let coer_ln_tensor : cfg.ln_tensor float -> cfg.ln_tensor A. exact (match cfg.normalization_type as nt return Config.ln_tensor_gen _ nt float -> Config.ln_tensor_gen _ nt A with
             | Some LN
             | Datatypes.None
               => fun x => x
             end). Defined.
      Definition coer_blocks_params
        := List.map
             (fun '((W_Q, W_K, W_V, W_O,
                      b_Q, b_K, b_V, b_O,
                      ln1_w, ln1_b) : cfg.block_params_type float)
              => ((W_Q:tensor _ A), (W_K:tensor _ A), (W_V:tensor _ A), (W_O:tensor _ A),
                   (b_Q:tensor _ A), (b_K:tensor _ A), (b_V:tensor _ A), (b_O:tensor _ A),
                   coer_ln_tensor ln1_w, coer_ln_tensor ln1_b)).
Definition logits (tokens : tensor s IndexType) : tensor (s ::' cfg.d_vocab_out) A. exact (HookedTransformer.logits
             (A:=A) (n_ctx:=cfg.n_ctx) (normalization_type:=cfg.normalization_type) cfg.eps
             cfg.W_E cfg.W_pos
             (coer_blocks_params cfg.blocks_params)
             (coer_ln_tensor cfg.ln_final_w) (coer_ln_tensor cfg.ln_final_b)
             cfg.W_U cfg.b_U
             tokens). Defined.
    End __.
  End HookedTransformer.

  
  Notation logits_all_tokens
    := (@HookedTransformer.logits 1 [Uint63.of_Z (Z.of_N (@pow N N N N_has_pow cfg.d_vocab cfg.n_ctx))] (of_Z (Z.of_N cfg.n_ctx)) float (@coer_refl float) coer_Z_float float_has_add float_has_sub float_has_mul float_has_div float_has_max float_has_sqrt float_has_exp true (@all_tokens true)).

  Definition logits_all_tokens_concrete : PArray.concrete_tensor _ float
    := PArray.concretize logits_all_tokens.
    Ltac mkApp f x :=
      lazymatch f with
      | fun y => ?f => constr:(match x with y => f end)
      end.

    Ltac set_step _ :=
      match goal with
      | [ H := context G[let s : ?T := ?v in @?f s] |- _ ]
        => lazymatch goal with
           | [ s' := v |- _ ]
             => let fs := mkApp f s' in
                let G' := context G[fs] in
                change G' in (value of H)
           | _
             => let s' := fresh s in
                pose v as s';
                let fs := mkApp f s' in
                let G' := context G[fs] in
                change G' in (value of H)
           end;
           cbv beta iota in H
      | [ H := context G[let s : ?T := ?v in _] |- _ ]
        => assert_fails is_var v;
           lazymatch goal with
           | [ s' := v |- _ ]
             => change v with s' in (value of H)
           | _
             => let s' := fresh s in
                pose v as s';
                change v with s' in (value of H)
           end;
           cbv beta iota in H
      | [ |- context G[let s : ?T := ?v in @?f s] ]
        => lazymatch goal with
           | [ s' := v |- _ ]
             => let fs := mkApp f s' in
                let G' := context G[fs] in
                change G'
           | _
             => let s' := fresh s in
                pose v as s';
                let fs := mkApp f s' in
                let G' := context G[fs] in
                change G'
           end;
           cbv beta iota
      | [ |- context G[let s : ?T := ?v in _] ]
        => assert_fails is_var v;
           lazymatch goal with
           | [ s' := v |- _ ]
             => change v with s'
           | _
             => let s' := fresh s in
                pose v as s';
                change v with s'
           end;
           cbv beta iota
      end.
    Ltac subst_cleanup _ :=
      repeat match goal with
        | [ H := ?v |- _ ] => is_var v; subst H
        | [ H := ?x, H' := ?y |- _ ] => constr_eq x y; change H' with H in *; clear H'
        end.
    Ltac lift_lets _ := repeat set_step (); subst_cleanup ().

    Ltac set_checkpoint _ :=
      repeat match goal with
        | [ H := context G[?x] |- _ ]
          => lazymatch x with PArray.checkpoint _ => idtac | PArray.maybe_checkpoint _ => idtac end;
             lazymatch (eval cbv delta [H] in H) with
             | x => fail
             | _ => idtac
             end;
             let x' := fresh "t" in
             pose x as x';
             let G' := context G[x'] in
             change G' in (value of H)
        | [ |- context G[?x] ]
          => lazymatch x with PArray.checkpoint _ => idtac | PArray.maybe_checkpoint _ => idtac end;
             let x' := fresh "t" in
             pose x as x';
             let G' := context G[x'] in
             change G'
        end.

    Ltac subst_local_cleanup _ :=
      repeat match goal with
        | [ H := [ _ ] : ?T |- _ ]
          => lazymatch T with
             | Shape _ => idtac
             | forall b, Shape _ => idtac
             | Slice.Concrete.Slice _ => idtac
             | IndexType => idtac
             | Slice.Slice _ => idtac
             | PolymorphicOption.option IndexType => idtac
             | PolymorphicOption.option int => idtac
             end;
             subst H
        | [ H := ?v |- _ ]
          => lazymatch v with
             | fun f x => f x => idtac
             | fun x => x => idtac
             | _ => first [ is_uint63 v | is_float v ]
             end;
             subst H
        | [ H := [ fun x => coer x ] : float -> float |- _ ] => cbv in H; subst H
        | [ H := [ coer point ] : float |- _ ] => cbv in H; subst H
        | [ H := [ coer_Z_float _ ] : float |- _ ] => cbv in H; subst H
        | [ H := [ _ ] : ?T |- _ ]
          => lazymatch T with
             | has_one int => idtac
             end;
             cbv in H; subst H
        end;
      cbv beta iota in *.

    Ltac reduce _ :=
      cbv beta iota delta [
          repeat repeat' reduce_axis_m1 map map' reduce_axis_m1' reshape_app_combine broadcast broadcast' reshape_app_combine' RawIndex.uncurry_radd RawIndex.split_radd reshape_snoc_split reshape_app_split reshape_app_split' RawIndex.curry_radd RawIndex.combine_radd RawIndex.hd RawIndex.tl
            adjust_index_for
            Nat.radd
            Classes.sqrt Classes.add Classes.sub Classes.opp Classes.mul Classes.div Classes.sqr Classes.one Classes.zero Classes.exp Classes.eqb Classes.neqb Classes.ltb Classes.leb Classes.matmul
            bool_has_one bool_has_zero bool_has_eqb
            int_has_one Uint63.int_has_ltb PrimInt63.ltb
            Sint63.max Sint63.int_has_leb lesb
            has_default_max_leb
            lift_coer_has_zero lift_coer_has_one
            Z_has_zero Z_has_one
            float_has_zero float_has_one
            coer_refl coer_tensor
            int_has_add
            Tensor.get Tensor.raw_get Slicing.SliceIndex.SliceIndexType.slice Slice.invert_index Slice.concretize PolymorphicOption.Option.sequence_return Slice.step Slice.start Slice.stop Slice.Concrete.length Slicing.SliceIndex.slice Slicing.FancyIndex.slice Slicing.FancyIndex.slice_ Slicing.FancyIndex.broadcast Slicing.FancyIndex.FancyIndexType.broadcast Slice.Concrete.normalize Slice.Concrete.step Slice.Concrete.stop Slice.Concrete.start Slicing.broadcast_one_index'' Slicing.broadcast_one_index'
            Slice.Concrete.step Slice.Concrete.stop Slice.Concrete.base_len
            Slicing.inject_int
            RawIndex.snoc RawIndex.nil
            map_dep map2 map2' map3
            ones tril to_bool
            Shape.tl Shape.hd Shape.snoc Shape.nil
            item int_has_eqb raw_get Shape.broadcast2 Shape.map2
        ] in *;
      cbn beta iota delta [fst snd Primitive.fst Primitive.snd] in *.
    Ltac do_red _ :=
      reduce ();
      lift_lets (); set_checkpoint (); subst_local_cleanup ().
    

    Ltac red_normalization_type_layers _ :=
      cbv beta iota delta [logits_all_tokens_concrete logits_all_tokens HookedTransformer.coer_blocks_params] in *;
      lift_lets (); set_checkpoint ().

    Ltac red_early_layers _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.logits HookedTransformer.Unembed.forward HookedTransformer.HookedTransformer.resid_postembed HookedTransformer.HookedTransformer.pos_embed HookedTransformer.HookedTransformer.embed HookedTransformer.Embed.forward HookedTransformer.PosEmbed.forward HookedTransformer.resid_postembed all_tokens] in *;
      lift_lets (); set_checkpoint ().
    Ltac red_blocks_layers_1 _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.blocks_cps HookedTransformer.HookedTransformer.blocks] in *;
      lift_lets (); set_checkpoint ().

    Ltac red_blocks_layers_2 _ :=
      cbv beta iota delta [TransformerBlock.ln1 LayerNorm.forward TransformerBlock.query_input TransformerBlock.key_input TransformerBlock.value_input TransformerBlock.add_head_dimension LayerNorm.scale LayerNorm.rescale LayerNorm.linpart LayerNorm.postrescale] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_3 _ :=
      cbv beta iota delta [Attention.attn_out Attention.z Attention.v Attention.pattern] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_4 _ :=
      cbv beta iota delta [HookedTransformer.Attention.masked_attn_scores HookedTransformer.Attention.attn_scores Attention.einsum_input Attention.q Attention.k] in *;
      lift_lets (); set_checkpoint (); do_red ().
    Ltac red_blocks_layers_5 _ :=
      cbv [Attention.apply_causal_mask] in *;
      repeat (cbv beta iota zeta in *; do_red ()).
    Ltac red_blocks_layers_6 _ :=
      cbv beta iota delta [softmax_dim_m1] in *;
      lift_lets (); do_red ().
    Ltac red_ops _ :=
      cbv beta iota delta [Bool.where_ where_
                             tensor_add tensor_sub tensor_mul tensor_div_by tensor_sqrt tensor_matmul diagonal mm
                             float_has_add float_has_sub float_has_mul float_has_div float_has_exp float_has_sqrt
                             coer coer_Z_float] in *;
      do_red ().
    Ltac red_sum _ :=
      cbv [Wf_Uint63.Reduction.sum Wf_Uint63.map_reduce Wf_Uint63.for_loop_lt Classes.eqb PrimInt63.eqb Monad.bind Wf_Uint63.get Wf_Uint63.LoopBody_Monad Wf_Uint63.run_body Wf_Uint63.bind Wf_Uint63.set Wf_Uint63.update Wf_Uint63.Reduction.mean Classes.int_div Uint63.int_has_int_div Classes.div coer coer_Z_float Classes.sub int_has_sub] in *.

    Ltac red_late_layers_1 _ :=
      cbv beta iota delta [HookedTransformer.HookedTransformer.ln_final HookedTransformer.HookedTransformer.unembed LayerNorm.forward HookedTransformer.Unembed.forward Unembed.forward] in *;
      lift_lets (); set_checkpoint ().
    Ltac red_late_layers_2 _ :=
      cbv beta iota delta [LayerNorm.linpart LayerNorm.scale LayerNorm.rescale LayerNorm.postrescale] in *;
      lift_lets (); set_checkpoint (); do_red ().

  Derive logits_all_tokens_concrete_opt
    SuchThat (logits_all_tokens_concrete_opt = logits_all_tokens_concrete)
    As logits_all_tokens_concrete_opt_eq.
  Proof.
    Unshelve.
    2:{
 pose proof cfg.blocks_params as blocks_params.
        pose proof cfg.ln_final_w as ln_final_w.
        pose proof cfg.ln_final_b as ln_final_b.
        destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
        all: shelve.
}
    red_normalization_type_layers ().
    subst blocks_params ln_final_b ln_final_w.
    set (blocks_params := cfg.blocks_params) in *.
    set (ln_final_w := cfg.ln_final_w) in *.
    set (ln_final_b := cfg.ln_final_b) in *.
    clearbody blocks_params ln_final_w ln_final_b.
    assert_succeeds destruct cfg.normalization_type.
    cbv beta zeta in *.
    red_early_layers ().
    red_blocks_layers_1 ().
    subst_local_cleanup ().
    rewrite List.firstn_all, List.map_map.
    lazymatch goal with
    | [ |- _ = ?concretize (List.fold_right ?k ?f ?ls ?resid) ]
      => let f' := open_constr:(_) in
         let ls' := open_constr:(_) in
         let Hf := fresh in
         let Hls := fresh in
         let f'' := fresh in
         pose f' as f'';
         assert (Hf : forall x, f'' x = f x /\ f'' = f);
         [ subst f'' | replace f with f''; [ subst f'' | clearbody f''; clear -Hf; abstract apply Hf, broadcast', point ]  ];
         [ | replace ls with ls'
         | .. ]
    end.
    3:{
 repeat match goal with H : _ |- _ => clear H end.
        instantiate (1:=ltac:(destruct cfg.normalization_type as [nt|]; [ destruct nt | ])).
        destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
        all: cbv beta iota zeta; subst_local_cleanup ().
        all: cbv beta iota delta [TransformerBlock.attn_only_out]; lift_lets (); set_checkpoint ().
        all: match goal with
             | [ |- _ = List.map ?f _ ]
               => let f' := open_constr:(_) in
                  let f'' := fresh in
                  pose f' as f'';
                  let H := fresh in
                  assert (H : forall x y, f'' x y = f x y /\ f'' = f);
                  [ subst f''; intros ??
                  | replace f with f''; [ subst f''; shelve | clearbody f''; clear -H; shelve ] ]
             end.
        all: lift_lets (); set_checkpoint ().
        all: red_blocks_layers_2 ().
        all: red_blocks_layers_3 ().
        all: red_blocks_layers_4 ().
        all: red_blocks_layers_5 ().
        all: red_blocks_layers_6 ().
        all: red_ops ().
        all: red_sum ().
        all: clear_all.
        all: repeat lazymatch goal with
               | [ H := ?x |- _ ]
                 => revert H;
                    lazymatch goal with
                    | [ |- let H := ?x in ?lhs = ?rhs /\ ?lhs' = ?rhs' ]
                      => change (lhs = (let H := x in rhs) /\ lhs' = (let H := x in rhs'))
                    end
               end.
        all: lazymatch goal with |- ?e ?x ?y = _ /\ _ => revert x y end.
        Unshelve.
        all: shelve_unifiable.
        all: lazymatch goal with
             | [ |- forall x y, ?lhs x y = @?rhs x y /\ ?lhs = _ ]
               => change (forall x y, lhs x y = rhs x y /\ lhs = rhs); instantiate (1:=rhs); split; abstract reflexivity
             | _ => idtac
             end.
        all: cbv beta iota.
        all: repeat match goal with H : _ |- ?ev = _ => is_evar ev; clear H end.
        all: lazymatch goal with
             | [ |- ?ev = List.map _ _ ]
               => is_evar ev;
                  let rhs := lazymatch goal with |- _ = ?v => v end in
                  instantiate (1:=rhs); abstract reflexivity
             | [ H : forall x y, _ = _ /\ _ = _ |- _ = _ ] => abstract (apply H; repeat split; apply broadcast'; exact point)
             | _ => idtac
             end.
        shelve.
}
    {
 repeat match goal with H : _ |- _ => clear H end.
      instantiate (1:=ltac:(destruct cfg.normalization_type as [nt|]; [ destruct nt | ])).
      destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
      all: intros.
      all: lift_lets (); subst_local_cleanup ().
      all: repeat match goal with H := Some _ |- _ => subst H end.
      all: repeat match goal with H := None |- _ => subst H end.
      all: cbv beta iota zeta.
      all: do_red ().
      all: red_late_layers_1 ().
      all: red_late_layers_2 ().
      all: red_ops ().
      all: red_sum ().
      all: do_red ().
      all: lazymatch goal with
           | [ |- context[Definitions.PrimFloat.of_Z ?z] ]
             => pose (Definitions.PrimFloat.of_Z z) as z';
                move z' at top;
                repeat match goal with
                  | [ H := context G[Definitions.PrimFloat.of_Z z] |- _ ]
                    => let G' := context G[z'] in
                       change G' in (value of H)
                  | [ |- context G[Definitions.PrimFloat.of_Z z] ]
                    => let G' := context G[z'] in
                       change G'
                  end
           | _ => idtac
           end.
      all: clear_all.
      all: repeat lazymatch goal with
             | [ H := ?x |- _ ]
               => revert H;
                  lazymatch goal with
                  | [ |- let H := ?x in ?lhs = ?rhs /\ ?lhs' = ?rhs' ]
                    => change (lhs = (let H := x in rhs) /\ lhs' = (let H := x in rhs'))
                  end
             end.
      all: lazymatch goal with |- ?e ?x = _ /\ _ => revert x end.
      Unshelve.
      all: shelve_unifiable.
      all: lazymatch goal with
           | [ |- forall x, ?lhs x = @?rhs x /\ ?lhs = _ ]
             => change (forall x, lhs x = rhs x /\ lhs = rhs); instantiate (1:=rhs); split; abstract reflexivity
           | _ => idtac
           end.
      all: cbv beta iota.
      all: shelve.
}
    all: cbv beta.
    all: do_red ().
    all: clear_all.
    cbv beta iota zeta in embed, pos_embed.
    destruct cfg.normalization_type as [nt|]; [ destruct nt | ].
    all: repeat match goal with H := Some _ |- _ => subst H end.
    all: repeat match goal with H := None |- _ => subst H end.
    all: cbv beta iota in *.
    all: red_ops (); do_red ().
    all: try subst logits_all_tokens_concrete_opt.
    all: repeat lazymatch goal with
           | [ H := ?x |- _ ]
             => revert H;
                lazymatch goal with
                | [ |- let H := ?x in ?lhs = ?rhs ]
                  => change (lhs = (let H := x in rhs))
                end
           end.
    all: lazymatch goal with
         | [ |- ?ev = ?v ]
           => tryif is_evar ev
             then instantiate (1:=v); abstract reflexivity
             else idtac
         end.
  Qed.
🛠️ Intermediate Coq File (useful for debugging if minimization did not go as far as you wanted)
🛠️ 📜 Intermediate Coq File log (useful for debugging if minimization did not go as far as you wanted)
📜 Build Log (contains the Coq error message) (truncated to last 8.0KiB; full 729KiB file on GitHub Actions Artifacts under build.log)
       refs/pull/9869/head  -> origin/pr/9869
 * [new ref]               refs/pull/987/head   -> origin/pr/987
 * [new ref]               refs/pull/9870/head  -> origin/pr/9870
 * [new ref]               refs/pull/9871/head  -> origin/pr/9871
 * [new ref]               refs/pull/9872/head  -> origin/pr/9872
 * [new ref]               refs/pull/9873/head  -> origin/pr/9873
 * [new ref]               refs/pull/9874/head  -> origin/pr/9874
 * [new ref]               refs/pull/9875/head  -> origin/pr/9875
 * [new ref]               refs/pull/9876/head  -> origin/pr/9876
 * [new ref]               refs/pull/9878/head  -> origin/pr/9878
 * [new ref]               refs/pull/988/head   -> origin/pr/988
 * [new ref]               refs/pull/9880/head  -> origin/pr/9880
 * [new ref]               refs/pull/9881/head  -> origin/pr/9881
 * [new ref]               refs/pull/9882/head  -> origin/pr/9882
 * [new ref]               refs/pull/9883/head  -> origin/pr/9883
 * [new ref]               refs/pull/9884/head  -> origin/pr/9884
 * [new ref]               refs/pull/9887/head  -> origin/pr/9887
 * [new ref]               refs/pull/9889/head  -> origin/pr/9889
 * [new ref]               refs/pull/989/head   -> origin/pr/989
 * [new ref]               refs/pull/9890/head  -> origin/pr/9890
 * [new ref]               refs/pull/9891/head  -> origin/pr/9891
 * [new ref]               refs/pull/9895/head  -> origin/pr/9895
 * [new ref]               refs/pull/9896/head  -> origin/pr/9896
 * [new ref]               refs/pull/9897/head  -> origin/pr/9897
 * [new ref]               refs/pull/9898/head  -> origin/pr/9898
 * [new ref]               refs/pull/99/head    -> origin/pr/99
 * [new ref]               refs/pull/990/head   -> origin/pr/990
 * [new ref]               refs/pull/9900/head  -> origin/pr/9900
 * [new ref]               refs/pull/9901/head  -> origin/pr/9901
 * [new ref]               refs/pull/9903/head  -> origin/pr/9903
 * [new ref]               refs/pull/9904/head  -> origin/pr/9904
 * [new ref]               refs/pull/9905/head  -> origin/pr/9905
 * [new ref]               refs/pull/9906/head  -> origin/pr/9906
 * [new ref]               refs/pull/9908/head  -> origin/pr/9908
 * [new ref]               refs/pull/9909/head  -> origin/pr/9909
 * [new ref]               refs/pull/991/head   -> origin/pr/991
 * [new ref]               refs/pull/9910/head  -> origin/pr/9910
 * [new ref]               refs/pull/9914/head  -> origin/pr/9914
 * [new ref]               refs/pull/9915/head  -> origin/pr/9915
 * [new ref]               refs/pull/9917/head  -> origin/pr/9917
 * [new ref]               refs/pull/9918/head  -> origin/pr/9918
 * [new ref]               refs/pull/992/head   -> origin/pr/992
 * [new ref]               refs/pull/9923/head  -> origin/pr/9923
 * [new ref]               refs/pull/9924/head  -> origin/pr/9924
 * [new ref]               refs/pull/9925/head  -> origin/pr/9925
 * [new ref]               refs/pull/9926/head  -> origin/pr/9926
 * [new ref]               refs/pull/9927/head  -> origin/pr/9927
 * [new ref]               refs/pull/9928/head  -> origin/pr/9928
 * [new ref]               refs/pull/993/head   -> origin/pr/993
 * [new ref]               refs/pull/9931/head  -> origin/pr/9931
 * [new ref]               refs/pull/9933/head  -> origin/pr/9933
 * [new ref]               refs/pull/9935/head  -> origin/pr/9935
 * [new ref]               refs/pull/9938/head  -> origin/pr/9938
 * [new ref]               refs/pull/9939/head  -> origin/pr/9939
 * [new ref]               refs/pull/994/head   -> origin/pr/994
 * [new ref]               refs/pull/9941/head  -> origin/pr/9941
 * [new ref]               refs/pull/9943/head  -> origin/pr/9943
 * [new ref]               refs/pull/9946/head  -> origin/pr/9946
 * [new ref]               refs/pull/9947/head  -> origin/pr/9947
 * [new ref]               refs/pull/9949/head  -> origin/pr/9949
 * [new ref]               refs/pull/995/head   -> origin/pr/995
 * [new ref]               refs/pull/9952/head  -> origin/pr/9952
 * [new ref]               refs/pull/9953/head  -> origin/pr/9953
 * [new ref]               refs/pull/9957/head  -> origin/pr/9957
 * [new ref]               refs/pull/9959/head  -> origin/pr/9959
 * [new ref]               refs/pull/996/head   -> origin/pr/996
 * [new ref]               refs/pull/9961/head  -> origin/pr/9961
 * [new ref]               refs/pull/9962/head  -> origin/pr/9962
 * [new ref]               refs/pull/9963/head  -> origin/pr/9963
 * [new ref]               refs/pull/9964/head  -> origin/pr/9964
 * [new ref]               refs/pull/9966/head  -> origin/pr/9966
 * [new ref]               refs/pull/9968/head  -> origin/pr/9968
 * [new ref]               refs/pull/997/head   -> origin/pr/997
 * [new ref]               refs/pull/9972/head  -> origin/pr/9972
 * [new ref]               refs/pull/9973/head  -> origin/pr/9973
 * [new ref]               refs/pull/9977/head  -> origin/pr/9977
 * [new ref]               refs/pull/9978/head  -> origin/pr/9978
 * [new ref]               refs/pull/998/head   -> origin/pr/998
 * [new ref]               refs/pull/9980/head  -> origin/pr/9980
 * [new ref]               refs/pull/9981/head  -> origin/pr/9981
 * [new ref]               refs/pull/9982/head  -> origin/pr/9982
 * [new ref]               refs/pull/9983/head  -> origin/pr/9983
 * [new ref]               refs/pull/9984/head  -> origin/pr/9984
 * [new ref]               refs/pull/9985/head  -> origin/pr/9985
 * [new ref]               refs/pull/9987/head  -> origin/pr/9987
 * [new ref]               refs/pull/9988/head  -> origin/pr/9988
 * [new ref]               refs/pull/9989/head  -> origin/pr/9989
 * [new ref]               refs/pull/999/head   -> origin/pr/999
 * [new ref]               refs/pull/9990/head  -> origin/pr/9990
 * [new ref]               refs/pull/9995/head  -> origin/pr/9995
 * [new ref]               refs/pull/9996/head  -> origin/pr/9996
 * [new ref]               refs/pull/9997/head  -> origin/pr/9997
 * [new ref]               refs/pull/9998/head  -> origin/pr/9998
 * [new ref]               refs/pull/9999/head  -> origin/pr/9999
++ cp -a coq coq-failing
++ cp -a coq coq-passing
++ printf '::endgroup::\n'
::endgroup::
++ printf '::group::df -h\n'
::group::df -h
++ df -h
Filesystem      Size  Used Avail Use% Mounted on
overlay         145G   58G   87G  41% /
tmpfs            64M     0   64M   0% /dev
shm              64M     0   64M   0% /dev/shm
/dev/root       145G   58G   87G  41% /usr/sbin/docker-init
tmpfs           7.9G     0  7.9G   0% /proc/acpi
tmpfs           7.9G     0  7.9G   0% /proc/scsi
tmpfs           7.9G     0  7.9G   0% /sys/firmware
++ printf '::endgroup::\n'
::endgroup::
++ printf '::group::download failing artifacts @ %s %s\n' 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7 'https://gitlab.inria.fr/coq/coq/-/jobs/6976756/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974951/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974986/artifacts/download'
::group::download failing artifacts @ 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7 https://gitlab.inria.fr/coq/coq/-/jobs/6976756/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974951/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974986/artifacts/download
++ printf '::warning::download failing artifacts @ %s %s\n' 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7 'https://gitlab.inria.fr/coq/coq/-/jobs/6976756/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974951/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974986/artifacts/download'
::warning::download failing artifacts @ 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7 https://gitlab.inria.fr/coq/coq/-/jobs/6976756/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974951/artifacts/download https://gitlab.inria.fr/coq/coq/-/jobs/6974986/artifacts/download
++ pushd coq-failing
/github/workspace/builds/coq/coq-failing /github/workspace/builds/coq /github/workspace
++ mkdir -p _build_ci
++ ln -s _build_ci saved_build_ci
++ git checkout 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7
fatal: reference is not a tree: 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7
📜 🔎 Minimization Log (truncated to last 8.0KiB; full 743KiB file on GitHub Actions Artifacts under bug.log)
(identical to the Build Log above: same ref-fetch output ending in `fatal: reference is not a tree: 2c72b4df2e27a9b150e9385733ac5b6edd7f3ec7`)

If you have any comments on your experience of the minimizer, please share them in a reply (possibly tagging @JasonGross).
If you believe there's a bug in the bug minimizer, please report it on the bug minimizer issue tracker.


Labels

needs: full CI — The latest GitLab pipeline that ran was a light CI. Say "@coqbot run full ci" to get a full CI.
needs: progress — Work in progress: awaiting action from the author.
