Xuan-Son Nguyen
1466621e73
llama : Support llama 4 text-only (#12791)
* llama4 conversion
* initial support, no chat template
* clean up a bit
* fix tokenizer conversion
* correct hparams
* try this
* fix shexp
* ffn_inp_normed
* chat template
* clean up model conversion
* add_bos
* add scale_before_ffn
* fix order
* weight_before_ffn
* llm_graph_input_attn_temp
* add chunk attn mask
* build_inp_attn_scale()
* add comment about ggml_repeat
* clarify comments
* fix build
2025-04-07 23:06:44 +02:00
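The "add chunk attn mask" item above refers to the chunked (local) attention used in parts of the Llama 4 text model, where a token may only attend to earlier tokens within the same fixed-size chunk. The sketch below is a minimal, illustrative Python version of that masking rule only; the function name and toy chunk size are assumptions for illustration and do not reflect the llama.cpp implementation.

```python
# Hedged sketch of a chunked causal attention mask.
# chunked_causal_mask and chunk_size=4 are illustrative assumptions,
# not names or values taken from llama.cpp.

def chunked_causal_mask(n_tokens: int, chunk_size: int) -> list[list[bool]]:
    """Return mask[i][j] == True when token i may attend to token j.

    Causal attention restricted to the same chunk, i.e. j <= i and
    floor(i / chunk_size) == floor(j / chunk_size).
    """
    mask = [[False] * n_tokens for _ in range(n_tokens)]
    for i in range(n_tokens):
        for j in range(i + 1):                      # causal: only j <= i
            if j // chunk_size == i // chunk_size:  # restrict to same chunk
                mask[i][j] = True
    return mask

if __name__ == "__main__":
    # With chunk_size=4, token 5 sees tokens 4..5 but not tokens 0..3.
    for row in chunked_causal_mask(8, 4):
        print("".join("x" if v else "." for v in row))
```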