The initial motivation, slow LaTeX compile times, is very interesting to me.
I use LaTeX as a tool for laying out books to print for hobby bookbinding, and my current project, a 3 MB, 500k-word beast of a novel, only takes around 10 seconds to compile.
I can't imagine what the friend of the author here had going on in his paper such that his compile times took such a hit. Required use of specific LaTeX libraries dictated by the journals he was submitting to that were written inefficiently? Specific LaTeX features or packages that end up hitting significantly slower codepaths?
Makes me wonder if it's not LaTeX itself that is the speed issue but rather the very maturity of the ecosystem it has as an advantage over Typst. It could entirely be possible that once Typst has the wide breadth of features and functionality available through its ecosystem, it would become just as easy to fall into compile-time tarpits.
My experience is the same as OP's: Typst is significantly faster than LaTeX. My book has a table of contents, parts and chapters, figures, code samples, tables, images, and a bibliography. These all require multiple passes to lay out. E.g. you cannot insert page numbers until you have laid out the table of contents, since it comes before the other content, yet you cannot construct the table of contents before you have processed the rest of the document. A typical novel won't have most of these, so I think it will be substantially easier to lay out.
Related work: Racket has Scribble, which is used for some books, academic papers, and a lot of package API docs.
https://docs.racket-lang.org/scribble/getting-started.html#%...
Although it doesn't look like Scheme, it has the full power of Scheme.
Generally, Typst looks like a significant improvement over LaTeX to me. The language is cleaner and easier to understand, and the first-class scripting support is appealing. Its embeddability and templating features make it an interesting option for automated PDF generation (e.g. invoices) as well.
However, its handling of introspection and convergence gives me a bad feeling.
Typst looks really promising, especially because it has common templates (like the IEEE one) that produce output nearly identical to LaTeX's.
My biggest gripe with latex is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn’t work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.
I still have no idea what was going on, and most makefiles out there seem to be obscenely complex and simply parse the output and run the same commands again if a certain set of errors occurred.
The definition of insanity is doing the same thing twice and expecting different results.
By coincidence, this is the basic way to compile latex.
> My biggest gripe with latex is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn’t work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.
I always feel like I’m doing something wrong when I have to deal with LaTeX and lose hours to fighting with the tooling. Even with a clean install on a new machine it feels like something fails to work.
The last time I had to change a document, I went through what felt like 100 different search results of people with the same issue before I found one with a resolution, and even that was completely obscure. I tried to help out by reposting the answer to a couple of other locations, but I was so exhausted that I swore off LaTeX for any future work unless absolutely unavoidable.
Absolutely not a perfect solution, and maybe you're already using it within your Makefiles, but for anyone who doesn't yet know about it there's Latexmk[1] which is supposed to automate all of this hassle. I think at least on Debian it's included with texlive-full. In addition it has some nice flags like `-outdir` which lets you send all the crazy LaTeX intermediate build/aux files to a separate directory that's easy to gitignore.
https://mgeier.github.io/latexmk.html#running-latexmk
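To make that concrete, the invocation I have in mind looks something like this (the file and directory names are just placeholders):

    # rerun pdflatex/bibtex as many times as needed, keep aux files in build/
    latexmk -pdf -outdir=build main.tex

    # or keep it running and rebuild on every save
    latexmk -pdf -pvc -outdir=build main.tex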
I think I used to understand this, but it's been a long time since I had to write any serious LaTeX, so I don't anymore. I found this snippet in my personal _quick-build-latex_ script from over a decade ago:

    if [ -z "$(find . -name "*.bib" -print0)" ]; then
        # Just two runs, to cover TOC building, etc.
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    else
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        bibtex "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    fi

So I guess if you're using bibtex, then you need to run it three times, but otherwise only twice?

This is to say... I'm glad those days are gone.
Just use Tectonic nowadays for compiling LaTeX source. It automatically handles these cases by rerunning the compiler as many times as needed.
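The basic invocation is a single command (assuming a standard Tectonic install, which fetches packages on demand):

    tectonic main.tex    # reruns the engine until cross-references stabilise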
Why didn't you use latexmk? It deals with the recompiling for you.
One of the things that really interests me about Typst is that the compile process seems much more deterministic and modern
What do you mean by tooling? I've used LaTeX for decades to write books and papers and the combination with Emacs was flawless. The only major change for me was the transition from Bibtex to Biblatex.
I'm sticking with LaTeX, not as a fetish, but because journals/conferences still do not accept e.g. Typst. Will they ever? I don't know; it depends on their willingness to integrate it into their toolchains, I guess.
I sincerely doubt they will: most journals in pure math still do not accept LuaTeX; just think about that.
Yeah, that was my first thought. And it's not just about them accepting typst, but also whether they would provide a template using typst, like they currently do for latex. Using the conference/journal template to write the article saves a lot of time for both submitters and editors (who have to deal with hundreds, if not thousands of submissions).
There are already at least two publishers which accept Typst. So that "ever" part is already covered. But most still don't accept Typst and LaTeX is usually mandatory if the sources are required.
That is for sure my biggest concern with typst. I wrote a tool that can convert from typst to latex for final submissions, but it is a bit sketchy and at the moment won't handle math very well. https://gitlab.com/theZoq2/ttt
I'm not familiar with how journal submissions work, but don't you simply submit a pdf at the end? Does it matter what engine you used to render it?
In case anyone hasn't seen some typst source and renders, here's a few documents I whipped up:
First is based on Todd C. Miller's Latex Resume Template:
- https://typst.app/project/rDUHMUg5vxl4jQ5q2grGPY
Second is an Enduring Power of Attorney:
- https://typst.app/project/rs9ZgGLhgM7iPvFs7PQv5O
Third is a will:
- https://typst.app/project/r45dVk6MpLjsoXMvxkTxsE
I’m gradually moving my work over to Typst and it’s been a breath of fresh air. Compiles very quickly.
Perhaps the hardest part has been relearning the syntax for math notation; Typst has some interesting opinions in this space.
Typst looks good, but I'm actually going back to LaTeX but paired with Claude Code in VS Code.
I took a hiatus from LaTeX (got my PhD more than a decade ago). I used to know TikZ commands by heart, and I used to write sophisticated preambles (lots of \newcommand). I still remember LaTeX math notation (it's in my muscle memory, and it's used everywhere including in Markdown), but I'd forgotten all the other stuff.
Claude Code, amazingly, knows all that other stuff. I just tell it what I want and it gets 95% of the way there in 1-2 shots.
Not only that, it can figure out the error messages. The biggest pain in the neck with LaTeX is figuring out what went wrong. With Claude, that's not such a big issue.
I hate a lot of things about LaTeX (also wrote several theses in it, as well as research articles), but the math syntax definitely wasn't one of them. Why on earth would they change it?
One relatively optimistic prediction: a few journals will accept Typst, while LaTeX export from Typst gradually matures, until we end up with a charade where, in certain fields, more people submit to journals through frontends like Quarto or Typst that output LaTeX than write LaTeX directly. Somewhere after that point, Typst will break through and be generally accepted itself.
mitex is an option [1]. There's no way I could learn another notation, at this point.
[1] https://typst.app/universe/package/mitex/
I tried Typst a year ago, and I found it really nice to use compared to LaTeX. I even managed to make (or modify, I don't remember) a small module to customize boxes, something I would not have even thought of trying with LaTeX.
I don't use LaTeX anymore and I don't have a use case for Typst, so I'm not currently using it, but I follow the advancements from time to time, and I have to disagree with the advisor.
Typst is perfectly fine for replacing LaTeX in almost any place that doesn't require the LaTeX source. The other catch is that the ecosystem is much smaller, so if you need a specific extension that does not exist and is not trivial to implement, you'll be out of luck and stuck with LaTeX.
I have only two peeves with typst.
1. They should have carried forward the latex standard as-is for math, instead of getting rid of the backslash escape sequence, etc.
2. There is no way to share a variable beyond a file's scope, so you can't have a setting that is shared across files, not even with state variables.
Other than this, typst is solid, and with the neovim editor and tinymist lsp, is great to write with.
Regarding point 1: I'm so glad they didn't keep the math syntax; there's finally progress in math text input! What we can now write is usually much shorter than the LaTeX equivalent (see the example below).

Regarding point 2: you can put your settings in a file `settings.typ` and import it from multiple files.
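As a rough illustration of the difference (my own example, a simple sum written both ways):

    // Typst
    $ sum_(k=1)^n k = (n(n+1))/2 $

    % LaTeX
    \[ \sum_{k=1}^{n} k = \frac{n(n+1)}{2} \]

And a minimal sketch of the shared-settings idea from point 2, with a hypothetical `settings.typ`:

    // settings.typ
    #let accent = rgb("#1f6feb")
    #let body-font = "Libertinus Serif"

    // chapter.typ
    #import "settings.typ": accent, body-font
    #set text(font: body-font)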
Glad to hear Typst has people doing serious work with it.
I've been able to avoid LaTeX. At uni, I went for org-mode -> LaTeX, which was OK except that my .emacs file kept filling up with LaTeX stuff to make random things work. To be honest, that means I probably can't even compile it again if I wanted to.
Typst has been awesome when I've used it (I always ran into LaTeX being horribly inconsistent when laying out stuff). Hope it continues.
Typst really does feel refreshing in that sense… way less fiddly and a lot more predictable, especially for layout tweaks
> "[…] was a friend telling me his LaTeX thesis took 90 seconds to compile towards the end"
Sure, but in order to iterate you don't have to compile the whole document; you can compile just the chapter you are working on by structuring it with \include (see the sketch below).
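A minimal sketch of that setup, with made-up chapter names:

    % main.tex
    \documentclass{book}
    % keep this line while iterating on one chapter; comment it out for the full build
    \includeonly{chapters/evaluation}
    \begin{document}
    \include{chapters/introduction}
    \include{chapters/evaluation}
    \include{chapters/conclusion}
    \end{document}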
From https://www.latex-project.org/about/:
"LaTeX is not a word processor! Instead, LaTeX encourages authors not to worry too much about the appearance of their documents but to concentrate on getting the right content."
IMO, the only people that use LaTeX are people who are willing to trade the convenience and productivity of using a sane document authoring format for the warm and fuzzy feeling you get when you use an outdated piece of typesetting software that is a) hard to configure, b) hard to use and c) produces output for the least useful reading platform available (paged pdfs).
And the pronunciation is stupid.
> IMO, the only people that use LaTeX are people who are willing to trade the convenience and productivity of using a sane document authoring format for the warm and fuzzy feeling [...]
I hope you are aware that literally all research in mathematics and computer science is typed up and published in LaTeX?
Alternatively, they're people who write documents in a field where LaTeX is the standard, they're not computer savvy enough to try to even look for something new that might be acceptable or might compile to LaTeX, and at any rate they want to focus more on their research than they do on changing the typesetting norms in their field.
(No shade on people who do decide to use alternatives, and Typst is great!)
Do you feel the same about Markdown?
Just curious.
What deters me from Typst is that latex math syntax is nowadays ubiquitous. You write $x^2=1$ and it renders in many places. Learning a new syntax for math expressions is simply not in my interests.
The threat is literally the opposite: it is very freeing to be able to write Typst syntax because it's quicker and easier to write, but then you're cursed by the fact that every other place still uses LaTeX math syntax by convention.
To be fair $x^2=1$ literally works in typst.
It is very fast to learn the Typst math syntax. It is easy and intuitive and usually less verbose than LaTeX. It should not be a difficult thing to learn for most people.
> But in the Bibtex file it is very common for the titles to appear in their original title case form
That is common because they are following the rules about how to steer capitalisation when using bib(la)tex (see the sketch after this list):
- If the entry is in English, and the style demands title case, output as is
- If the entry is in English, and the style demands sentence case, convert to sentence-case and output
- If the entry is not in English, output as is
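To make the "steering" concrete, the usual convention is to store title case in the .bib file and wrap anything that must keep its capitalisation in braces, so a sentence-case style can safely lowercase the rest (the entry itself is made up):

    @article{doe2021,
      author = {Doe, Jane},
      % braces protect proper nouns when a style converts to sentence case
      title  = {A Survey of {Markov} Chain Methods in {Bayesian} Statistics},
      year   = {2021},
    }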
> If the entry is in English, and the style demands sentence case, convert to sentence-case and output
Nope: not possible to automatically determine which capitalised nouns are proper (and thus remain capitalised in sentence case) and which are common (and thus become uncapitalised).
This is in fact why it is better to store sentence case: it can be unambiguously converted to title case while the reverse is ambiguous. It’s not mere preference.
It's been a decade since I wrote anything in LaTeX, and I echo all its pain points.
But it seems like LaTeX is the kind of thing that LLMs would nail perfectly. I feel like using it today wouldn't be very bad.
Tangentially, do LLMs pick up new languages that have less internet discussion and that develop rapidly after knowledge cutoff dates? According to the naysayers, AIs are supposed to generate hands with six fingers and ossify language and framework versions.
Maybe if it's completely distinct. Otherwise definitely not, unless, maybe, the model is fine-tuned. I had a discussion about this with my dad, whose work involves developing in a non-mainstream Smalltalk dialect, where it doesn't work at all.
I suppose it also depends on the specific LLM; the output of a free/low-cost model will likely be very different from a $200/month o1-pro.
I have so far not been able to get a major LLM to generate fully functional Typst code, no matter how much context I try to put into it. The models do not currently seem to understand Typst's concept of modes (code, markup and math), and especially in code mode they suffer from heavy hallucination of syntax and/or semantics.
In thirty years LaTeX will still be open source and probably will be maintained.
Typst appears to be a mix of open source and closed source; the general model here tends to be neglecting the open source part and implementing critical features in the closed source portion. Which is to say, it's unlikely to live beyond the company itself.
Typst is fully open source, licensed under the Apache-2.0 license. It is not a mix of any kind. Don't confuse the web app with the Typst engine. The web app is a service similar to Overleaf, and that is closed source. It is not mandatory; you can use Typst fully on your local machine. The team tries to make money and cover development costs with the web app, but the actual typesetting engine is fully open source and free.
you are wrong. typst's lead dev has stated that an important goal is to have the CLI (which is open source) and web app behave identically, even refusing to implement such a basic feature as PDF embedding because, due to technical reasons, it is currently incompatible with this goal. [1]
typst, the project, is not by any means a "mix" of open and closed, even if typst, the company, is. indeed, the most thorough LSP implementation available (tinymist) is not only open source but a community project. for another funny example see typstify, a paid typst editor not affiliated with the company. [2]
[1]: https://github.com/typst/typst/issues/145#issuecomment-17531...
[2]: https://typstify.com/purchase/
I disagree. The web app editor is closed source, but much of what it provides is open source so editing is a similar (and imo better) experience locally. The typst compiler and LSP and everything you need to use it is open source.
Imo the situation is more like if overleaf were also the people who made the LaTeX project originally.
I think the only possible issue with the Typst org dying (assuming it happens after the full 1.0 version, so it's mostly maintenance) is that packages are automatically downloaded from the Typst site. But an open mirror could trivially be made, since the set of packages is just an open source git repo and the closed source site only hosts tar.gz archives of the folders in that repo. Not a big deal, I think.
That is a real concern, but I wouldn't say there are any critical features in the closed source portion. I wrote the whole thesis locally with only open source tools. One of the included papers was written in the cloud platform for collaboration.
It is a concern that there is a single company doing most of the development, but there is quite a bit of community involvement, so I don't think it is an immediate problem.
> In thirty years LaTeX will still be open source and probably will be maintained.
The latter is a genuine concern. Will it be maintained? I like LaTeX a lot, but would I want to maintain its internals? No. Could I? If I were paid handsomely, yes. Emphasis on handsomely.
Which leads to another worry: LaTeX itself may be OSS, but down the line it is possible that maintained forks will be controlled by big publishers paying maintainers to deal with the insanity of its internals. And we all know how lovely those publishers are (凸ಠ益ಠ)凸
> implementing critical features in the closed source portion
Like which critical features, for example?
> neglecting the open source part
So it's no different than fully open sourced projects.
On the flip side, new tools like Typst are trying to push the UX forward in ways that the LaTeX ecosystem often struggles with. I think it comes down to what risks you're comfortable with
Does that matter? The article is a PDF, like other LaTeX-generated PDFs.
I've used Typst to generate reports in multiple languages and it works pretty well for this! I just pass Typst a JSON file with the report data and use it from there.
Especially in combination with file watching. Your script writes to the JSON file and the entire document and everything that depends on it updates automatically, often in less than a second.
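A minimal sketch of that pattern (the file name and fields are just placeholders):

    // report.typ: reads data produced by an external script
    #let data = json("report.json")

    = Report for #data.customer

    Generated on #data.date with #data.total line items.

Run with `typst watch report.typ report.pdf` and the PDF refreshes whenever the script rewrites report.json.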
Very cool! I ran into the multiple-bibliography issue when attempting to typeset my grandmother's PhD thesis, which I was able to rescue from the 5.25" floppies it was originally stored on. I was planning on waiting until they solved this officially to resume that side project, but might give Alexandria a shot!
That sounds like a fun project! Alexandria is the way to go for now but hopefully they will get proper support for it sooner rather than later.
Does Mendeley perform any better here than it does with overleaf?
I was on the typst train, particularly because its layout engine has some additional vertical control for long documents that latex lacks. However, just about when I was looking at moving over, LLM coding became good or at least good enough, and one area the current crop is bad at is doing layout in anything but latex. Not that they are good at latex, but they are terrible, terrible, terrible at typst. Really bad. Maybe in another year or six months!
I understand why people like using LLMs for coding, saves them having to think, but it is deeply frustrating to see it being such a crutch that some people cannot use new tools without it.
I suppose the issue is not new; many people didn't want to use new languages before because they couldn't copy snippets from the internet, but it was frustrating then too.
> and one area the current crop is bad at is doing layout in anything but latex. Not that they are good at latex, but they are terrible, terrible, terrible at typst
I'm surprised to hear that—I've been using GitHub Copilot with ConTeXt [0] since 2021, and it mostly works fairly well. And ConTeXt is much more obscure than Typst (but also much older, so maybe that gives it an advantage?).
[0] https://wiki.contextgarden.net/Introduction/Quick_Start
Well, they are good in markdown and rust. Perhaps feeding some Typst documentation overview into the prompt could solve it?
The ecosystem issues and rough edges in bibliography handling don't surprise me, but the fact that you can script so much directly inside the document is really appealing.
Is Typst's typesetting quality on par with "bare" LaTeX? With LaTeX + microtype?
It may be stupid and vain, but for me, if it doesn't at least match the former it's a no-go.
Until 0.13 it wasn't quite as good as LaTeX in my experience; it mainly inserted more hyphens than LaTeX.
As of this version, it would be very hard to tell the difference, in my experience.
Yes, it uses a very similar algorithm to LaTeX's. It also already incorporates some microtype features out of the box, so the typesetting quality is very good and easily comparable to LaTeX. Working with Typst is so much easier and faster than with LaTeX that you will be more productive. Many things can be done without resorting to external packages, and scripting is a breeze compared to LaTeX.
Just try it out. It is free, open source and very easy to set up. Just install the Tinymist extension in VS Code; that is all you need.
I've never had a big complaint about LaTeX; it's easy to get into and the results are stunning if you use it minimally and with care. The only thing I've always been missing is an easier way to get perfect register-true typesetting for books. This has to do with LaTeX's paragraph flow algorithm and lack of global optimization, or so I've been told.
Can Typst provide better register-true layout? That would be interesting to me.
Typst will probably be dead or acqui-hired in a few years.
Latex will be around for decades.
The Typst compiler is completely open source. I already prefer my local copy of the Typst compiler and CLI to whatever LaTeX provides, and there seems to be a still-growing community that could keep the project going even after a malicious acquisition of some kind.
Congratulations to the author.
I have to agree that Typst source generally looks a lot less ugly than LaTeX. I have considered writing stuff in Typst many times, but I couldn't muster the courage to do so.
Nice debrief. I think, though, that some of the downsides the author mentions can be addressed relatively easily with Quarto, which has embraced Typst since its early days, as I recall. Especially the bibliography issue.
Congrats OP on your PhD!
It can be hard to write macros with state in typst.
It is hard to write macros in LaTeX.
i switched all of our pdf generation to typst - fantastic software. love how efficient it is; it makes previewing trivial and iteration very fast.
Interesting! Did you use a tool to do the conversion automatically? How did it pick up on custom packages and styling?
Great work. Screenshots would be nice.
Why not LyX or TeXmacs? Both seem to be better options than yet another markup language.
In addition to making it possible to write easily, TeXmacs is also based on a markup language. It demonstrates that a markup language and WYSIWYG writing can coexist efficiently.
Why do CS doctoral candidates have such a fascination with typesetting? I mean, be into whatever you’re into, I guess.
But as soon as someone starts talking about LaTeX and how they spent months on their macros, I think “another hapless victim has fallen into LaTeX’s trap.” It’s like an ant lion that feeds on procrastinating students.
I was a math major in undergrad. We care about typesetting so much because you really do not want to be stuck handwriting everything, but it's not easy to type faster than you can handwrite when you're writing out rows and rows of equations. (Physics was actually a lot harder for me to keep up with while typing than math was.)
And when your life is revolving around classes or your thesis, the #1 most important thing to you in the world is how easily you can transfer your ideas to paper/digital format. It makes a lot of sense that people care a lot about the quality of their typesetting engine and exchange macro tips with each other (I got a lot of helpful advice from friends, and my default latex header was about 50% my own stuff and 50% copied from friends in my same major)
This is not limited to CS or LaTeX in any way. Plenty of students spend a lot of time fiddling with Word, PowerPoint, note-taking systems, citation management (which is surprisingly horrible in MS Word), Adobe software, etc.
Obvious reasons:
- Your thesis is a major output of years of work. Of course you want it to look good.
- You might think it superficial, but if the presentation looks bad, many people (subconsciously) interpret this as a lack of care and attention. Just like an email with typos feels unprofessional even if the content is otherwise fine.
- Spending time on tooling feels productive even when, past a certain point, it is not.
- People that are into typesetting now have an excuse to spend time on it.
That said, in my experience people spent a few hours learning "enough" LaTeX several years ago and almost never write any macros. Simple reason: you work with other people and different journal templates, so the less custom code the better.
Time spent on typesetting produces immediately visible results (however minor). Actual research doesn’t. It’s the classic feedback loop problem, so like you said, procrastinating students devote lots of time to largely pointless but seemingly productive activities like typesetting.
I was there once. In hindsight all the tweaks were a complete waste of time. All I needed was amsart, plus beamer for slides.
It's because LaTeX gives us a sense of legitimacy. (it's also why people go overboard with math notation in LaTeX documents, even when prose is more appropriate).
It produces documents that look like those produced by professors, and luminaries in the field. If you write equations in Word Equation Editor, your work just doesn't look very serious.
It's the same joy I felt when I laser-printed my first newsletter designed in Aldus PageMaker. I was only in my teens but I felt like a "professional".
A small but important aspect of typesetting/WYSIWYM is the ability to break a large document (like a thesis) into discrete sub-components. You can work on each section of your document in an individual .tex file and include it from your top-level .tex file. This setup works well with a VCS like git.
Another ergonomic benefit is scripting. For example, if I'm running a series of scripts to generate figures/plots, LaTeX will pick up on the new files (if the filename is unmodified) and update those figures after recompiling. This is preferable to scrolling through a large document in MS Word and attempting to update each figure individually.
As the size and figure count of your document increases, the ergonomics in MS Word degrade. The initial setup effort in LaTeX becomes minimal as this cost is "amortized" over the document.
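A minimal sketch of that layout, with made-up file names:

    % main.tex
    \documentclass{report}
    \usepackage{graphicx}
    \begin{document}
    \input{sections/methods}  % each section lives in its own file under version control
    \begin{figure}
      \centering
      % regenerated by an external script; recompiling picks up the new version
      \includegraphics[width=0.8\textwidth]{figures/results.pdf}
      \caption{Results.}
    \end{figure}
    \end{document}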
Another reason to use LaTeX for papers back in the day was that Microsoft Word would routinely corrupt large documents in terrifying ways. Sometimes the root of the corruption existed in the document somehow long before any of it was visible, so even recovering from an old backup would just lead to the problem repeating. I recall the only way to properly "recover" an old backup was to copy it all via plain text (e.g. Notepad), and then back into a brand new Word document.
This is all to say, if you were working on a thesis or even a moderately large assignment, working in Word was not good for the nerves.
Looking back, I probably should have just worked in plain text and then worried about formatting only at the very end, but ummm, yes, I guess another hapless victim did indeed fall into LaTeX's trap. :)
I give 0 fs about typesetting. But typical mainstream software just cannot process a 500-page document with tables, figures, references, equations, etc. If Word/Pages/OpenOffice/Google Docs could do it, no sane person would sink hundreds of hours into debugging LaTeX out-of-memory errors.
But once you are in the LaTeX world, you start noticing how much prettier things can be. And then you end up sinking another thousand hours into perfectly aligning the summations in your multi-line equations.
From watching people write their thesis in both latex and word, I'd say if anything it is the other way around. The people who write their thesis in word (or another wysiwyg editor) spend more time on their layout than the people writing in latex. Worse, they spend the time while writing, while latex allows for separation of tasks, which allows people to get into the flow much more easily.
Sure, theoretically you can concentrate only on writing in Word and ignore layout. In practice it takes a lot of discipline, so instead you see people moving figures around, adding spaces or returns to push a heading where they want it, etc., in particular as a way to procrastinate from actual writing.
For me personally, I have yet to figure out how to get a word processor to have text be justified on both sides without inserting big gaps between words. I could use left justified but then the text ends up looking like a saw blade, which is still ugly.
LaTeX's handling of floating figures and tables is also much better.
And of course math notation is much nicer to work with in LaTeX (IMO).
LaTeX typesetting is a solved problem. Memoir or Classic Thesis, paired with microtype, provide outstanding results and you need to spend zero time on tweaking stuff.
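For reference, the "zero tweaking" setup mentioned here is essentially just a two-line preamble (the class options are my own choice):

    \documentclass[11pt,a4paper]{memoir}
    \usepackage{microtype}  % character protrusion and font expansion, on by default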
Typst is interesting, but it doesn't yet support all microtypography features provided by microtype. IMHO, those make a big difference.
I wrote my joint med-CS honours thesis (a one-year research thing we have in Aus) in Word. My med supervisor was happy with it. My CS supervisor insisted I reformat it in LaTeX as he couldn't stand the typesetting.
Honestly I don't disagree with him, it looked far better in 'TeX. But that's probably a learnt preference.
In essence, it's culture.
Not all of us fell into that trap! My dissertation was written almost entirely using a default document class and a handful of packages, and only towards the end did I apply the university document style to come into compliance. I had more than enough to do on the subject of the PhD and didn’t have the patience to burn time on typesetting or fiddling with macros.
I’ve found in the decades since then that my most productive co-authors have been the ones who don’t think about typesetting and just use the basics. The ones who obsess over things like tikz or fancy macros for things like source layout and such: they get annoying fast.
TikZ is misplaced in this list; it is how you make any kind of vector drawing in LaTeX. It's not the only way, but perhaps the best documented and most expressive one. If you have any such drawings in your work, you won't get around putting some effort into it. Not comparable with boxed theorems or fancy headings.
Tikz is sometimes useful, but it can also be a massive time sucking pain in the butt.
I mean it is one of the few packages that can actually manage to annoy LaTeX fans, which is really saying something.
I think the annoyance with TikZ is twofold: (1) it tries to do a really hard thing (create a picture with text in a human writable way), (2) it is used infrequently enough that it’s hard to learn through occasional use.
That said, nobody makes you use TikZ, fire up Inkscape and do it wysiwyg.
I'm quite glad some alternatives are popping up. Using LaTeX feels like using a piece of 80s tech, to be honest. It is obviously fine and super powerful, but fine in a vim-style way. There have got to be more contemporary alternatives to the status quo.
Not everyone is into nostalgia. I'm not trying to take LaTeX or vim away from anyone; it's just not for everyone.
Pro tip: type long content unformatted, or barely formatted, then ask an LLM to format it using your markup of choice, then clean up the things it got wrong.
They are very decent at inferring the context of stuff and will mark up code, maths, titles and so on fairly decently. This lets you focus on the work of making it look nice.
Why not use JavaScript, JSX and TypeScript to produce PDFs? You'd be using a language you already know.
AI is the primary audience for our writing, and the primary reason to reconsider our choice of markup format. It's all about semantic compression: Typst source, markdown, and asciidoc are far more concise than LaTeX source.
I'm observing, not here to convince anyone. The last six months of my life have been turned upside down, trying to discover the right touch for working with AI on topological research and code. It's hard to find good advice. Like surfing, the hardest part is all these people on the beach whining how the waves are kind of rough.
AI can actually read SVG math diagrams better than most people. AI doesn't like reading LaTeX source any more than I do.
I get the journal argument, but really? Some thawed-out-of-a-glacier journal editors still insist on two column formats, as if anyone still prints to paper. I'm old enough to not care. I'm thinking of publishing my work as a silent animation, and only later reluctantly releasing my AI prompts in the form of Typst documentation for the code.