"near clone" is a bit exaggerated. As much as I'm a free software zealot, I don't think Octave comes close to matlab yet (provided you do anything a bit more advanced than the practical of some courses)
I don’t think Matlab or Octave are great languages for software engineering. Actually, these languages are like example #1 of the difference between engineering software vs software engineering: they are excellent tools for writing, like, 10-100 line numerical experiments.
Anyone who runs up against a limitation of Octave has probably hit the point where they should consider switching, but not to Matlab or some other scripting language, but to Fortran or maybe Julia or something.
Therefore, I disagree with the accepted answer in that StackOverflow thread. The language is only good in the first place for short codes anyway, so fixing any little octave/matlab regionalisms is not a big deal. And, since it is a mathematical experiments, you should understand what every line of code does, so running the code without reading it is not really an option.
There's plenty of satellites, rockets, re-entry vehicles whose guidance and control code were designed and written using MATLAB/Simulink and then "autocoded" to C using "MATLAB Coder".
While not my preferred way of doing things, it is popular for this purpose throughout the aerospace industry.
They are never meant for general software engineer but for numerical analysis/data analysis and engineering. In fact they are quite horrible for writing general software code -- the APIs for IO and HTTP requests are very lacking compared what you can find in other languages, for example.
I haven't found a better CLI calculator utility for writing more than one-liner numerical stuff with some plots than MATLAB and octave. They're fantastic.
Why does this matter in the least? Like you must understand that this is a library call right? Like just put `import numpy as np` in your PYTHONSTARTUP and it's the exact same UX in python.
Matlab/Octave is great for numerical programs that perform within an order of magnitude of Fortran. If some things aren't fast enough, you can rewrite them in C or Fortran without too much trouble. If you're doing anything other than numerical computing, it's awful, and you should use a different language.
(Source: I did a PhD using a mixture of Octave for numerical stuff, Perl for text-processing and automation, and C++ for the parts that were too slow. Choose the right tool for the job.)
I was in one of those early cohorts that used Octave, one of the things the course had to deal with was that at the time (I don't know about now) Octave did not ship with an optimization function suitable for the coursework so we ended up using an implementation of `fmincg` provided along with the homework by the course staff. If you're following along with the lectures, you might need to track down that file, it's probably available somewhere.
Using Octave for a beginning ML class felt like the worst of both worlds - you got the awkward, ugly language of MATLAB without any of the upsides of MATLAB-the-product because it didn't have the GUI environment or the huge pile of toolbox functions. None of that is meant as criticism at Octave as a project, it's fine for what it is, it just ended up being more of a stumbling block for beginners than a booster in that specific context.
The real value of a degree unfortunately isnt the education it's the exclusivity of the program. When bootcamps realized this some started having more stringent admissions.
There's a great recent book (Anne Trumbore's _The Teacher_in_The_Machine_) on using technology to "disrupt" education (starting much earlier than you would think, with mechanical devices in the early 20th century that could drill students with multiple choice questions, running through basically pre-computer MOOCS that used radio and then TV to broadcast lectures, various educational software, and finally MOOCs like Coursera and Udacity).
I'm not a Matlab user, but from what I can tell, even if the language can be cloned, there's a lot more to Matlab: It's a GUI driven software suite, with a lot of pre-written apps that eliminate the need for coding in many cases.
It comes with vendor support and "official-ness" for lack of a better word.
Things are changing rapidly in this area but it wasn't very long ago that most people reacted to open-source software as something weird that shouldn't be trusted.
For anyone else who hadn’t heard of JupyterLite — it’s like Jupyter Notebook/Lab, but it runs completely in your browser. No servers, no backend — everything executes client-side.
It’s slower than native, sure — but for education, it’s a game changer. Students can open a notebook in any browser, on any device (even a Chromebook or iPad), and start coding instantly — no installs, no setup issues. Perfect for workshops, classrooms, or sharing interactive tutorials. It runs real Python, so you can teach core concepts, plotting, and even simple data analysis right in the browser. For heavier computation, you’d still offload to a remote kernel, but for learning and experimentation, it’s more than fast enough.
Hmm. Do we expect X on Y to have run times more like X*Y or max(X,Y)? Or maybe some more complicated combination because you have to pay both their overheads but then once things start cranking you are just paying the per-element cost of one of the languages…
I'm not an expert. I speculate that the compiler is unlikely to optimize the wasm binary better than an x86 binary. Furthermore, every VM instruction is on average going to need more than 1 cpu instructions to be executed. Intuitively, that would suggest slower execution. That is also what we see happen in practice with VMs.
Python is not a particularly fast language in the first place due to bad utilization of memory, hash table lookups everywhere and a high function call overhead.
Always found the attraction is buried all those issue bursting enjoyment by the author. Should the diagram be up front and possibly the next release features … then the making of or the issue of making of …
Unfortunately it looks like they did it wrong, by providing explicit GPU types and functions, instead of converting unmodified Octave code to run directly with GPU acceleration implicitly:
It would be awesome if Octave got implicit GPU acceleration in the browser with something like OpenCL. Unfortunately it looks like OpenCL was never ported to WebGL, so WebCL isn't implemented yet:
It's always astonishing to me how the obvious path is rarely taken by industry, because writing open solutions is self-evidently less profitable than writing proprietary ones. Look up the history of the blue LED and countless other innovations to see how that works and why.
I'm hopeful that AI will relieve programmer burden enough that we can explore these obvious roads not traveled. Because we're off on a very long tangent from what mainline computer science evolution might have looked like without tech's wealth inequality.
Unfortunately I see two major (rarely discussed) pitfalls looming with AI:
1) Every tech innovation brings a higher workload for the same pay. The amount of knowledge required to be a full stack developer in 2025 in higher than in 2015, which was higher than in 2005, which was higher than in 1995, and so on. Yet starting pay has not increased with inflation.
2) With AI bringing pair programming everywhere, we may see a decline in overall code quality if humans don't have to deal with it directly. Extended pair programming can lead to over-engineered codebases that can only be read by teams of humans instead of individuals. So whereas one untrained hobbyist could build a website in 1995 using principles like data-driven design, declarative programming and idempotence, today it requires a team to untangle the eventualities of imperative nondetermistic async code that from a user perspective is equivalent to simply hiding the progress bar in the browser.
That's why I'm such a proponent of alternative methods. Abstractions that are quite verbose to represent in, say, Python, can be expressed as one-liners in Octave. The only way to get more concise would be to move towards more of a functional assembly language like Lisp, at the cost of the syntactic sugar provided by array-based languages.
TL;DR: I believe that the most direct path from J.A.R.V.I.S./Star Trek style AI prompts to readable but efficient code is through DSLs like Octave/MATLAB, and some of the lost ways of doing business logic in the 1980s like Spreadsheets, HyperCard and Microsoft Access or FileMaker. Open tools like a GPU accelerated Octave would help us gain more leverage in writing software and possibly speed the evolution of AI itself by helping us more closely express abstractions in code.
For anyone else who hadn't heard of Octave, it's an open source near-clone of the proprietary MATLAB: https://en.wikipedia.org/wiki/GNU_Octave
"near clone" is a bit exaggerated. As much as I'm a free software zealot, I don't think Octave comes close to matlab yet (provided you do anything a bit more advanced than the practical of some courses)
See https://stackoverflow.com/questions/12084246/differences-bet...
I don’t think Matlab or Octave are great languages for software engineering. Actually, these languages are like example #1 of the difference between engineering software vs software engineering: they are excellent tools for writing, like, 10-100 line numerical experiments.
Anyone who runs up against a limitation of Octave has probably hit the point where they should consider switching, but not to Matlab or some other scripting language, but to Fortran or maybe Julia or something.
Therefore, I disagree with the accepted answer in that StackOverflow thread. The language is only good for short programs in the first place, so fixing any little Octave/MATLAB regionalisms is not a big deal. And since these are mathematical experiments, you should understand what every line of code does, so running the code without reading it is not really an option.
> 10-100 line numerical experiments
There are plenty of satellites, rockets, and re-entry vehicles whose guidance and control code was designed and written using MATLAB/Simulink and then "autocoded" to C using "MATLAB Coder".
While not my preferred way of doing things, it is popular for this purpose throughout the aerospace industry.
They were never meant for general software engineering but for numerical analysis/data analysis and engineering. In fact they are quite horrible for writing general software -- the APIs for IO and HTTP requests are very lacking compared to what you can find in other languages, for example.
I haven't found a better CLI calculator utility than MATLAB and Octave for writing more-than-one-liner numerical stuff with some plots. They're fantastic.
Python is trash, by comparison.
You think MATLAB is better than (checks notes) a scripting language for writing one liners/throwaway code? Is that what you're saying here? Lol
>a scripting language for writing one liners/throwaway code
Just objectively not an accurate description of Python
People who use MATLAB use it for the toolboxes.
The language itself is awful.
> The language itself is awful.
As a programming language freak, I must disagree... in what other programming language can you solve a linear system Ax=b in one line (`x = A\b`),
without any external libraries or imports, just with the base language? I never used any official MATLAB "toolbox", but still love the language via the Octave interpreter. It's so clean and straightforward!
Well, in Julia, for one.
> without any external libraries or imports
Why does this matter in the least? Like you must understand that this is a library call right? Like just put `import numpy as np` in your PYTHONSTARTUP and it's the exact same UX in python.
https://docs.python.org/3/using/cmdline.html#envvar-PYTHONST...
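For reference, here is that Python-side one-liner: once `numpy` is imported (whether by hand or via PYTHONSTARTUP), solving Ax=b is a single call, the counterpart of Octave's `x = A\b`.

```python
import numpy as np

# A small linear system: 3x + y = 9, x + 2y = 8.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# One line once numpy is in scope; the NumPy analogue of Octave's `x = A \ b`.
x = np.linalg.solve(A, b)
print(x)  # → [2. 3.]
```

The difference the Octave fans point to is not capability but defaults: the base language is the numerics library, so there is nothing to import first.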
Matlab/Octave is great for numerical programs that perform within an order of magnitude of Fortran. If some things aren't fast enough, you can rewrite them in C or Fortran without too much trouble. If you're doing anything other than numerical computing, it's awful, and you should use a different language.
(Source: I did a PhD using a mixture of Octave for numerical stuff, Perl for text-processing and automation, and C++ for the parts that were too slow. Choose the right tool for the job.)
Early versions of Andrew Ng's ML MOOC used Octave, if you are looking for examples and exercises.
YouTube playlist: https://www.youtube.com/playlist?list=PLiPvV5TNogxIS4bHQVW4p...
I was in one of those early cohorts that used Octave. One of the things the course had to deal with was that, at the time (I don't know about now), Octave did not ship with an optimization function suitable for the coursework, so we ended up using an implementation of `fmincg` provided along with the homework by the course staff. If you're following along with the lectures, you might need to track down that file; it's probably available somewhere.
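For anyone who can't find that file: `fmincg`'s job is unconstrained minimization of a cost function that returns both a value and a gradient. A crude gradient-descent stand-in (sketched in Python for illustration; `fmincg` itself uses conjugate gradients and a line search, and this toy cost function is my own, not the course's) looks like:

```python
import numpy as np

def cost(theta):
    """Toy quadratic cost J(theta) = ||theta - [1, 2]||^2, returned with its gradient,
    mirroring the (value, gradient) interface the homework's cost functions used."""
    target = np.array([1.0, 2.0])
    J = float(np.sum((theta - target) ** 2))
    grad = 2.0 * (theta - target)
    return J, grad

def minimize(cost_fn, theta0, lr=0.1, iters=200):
    # Plain gradient descent: repeatedly step against the gradient.
    theta = np.asarray(theta0, dtype=float)
    for _ in range(iters):
        _, grad = cost_fn(theta)
        theta -= lr * grad
    return theta

theta = minimize(cost, np.zeros(2))
print(theta)  # converges to (approximately) [1. 2.]
```

Anything with this shape (a callable returning cost and gradient, plus a driver loop) can substitute for `fmincg` on well-conditioned homework problems, just more slowly.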
Using Octave for a beginning ML class felt like the worst of both worlds: you got the awkward, ugly language of MATLAB without any of the upsides of MATLAB-the-product, because it didn't have the GUI environment or the huge pile of toolbox functions. None of that is meant as criticism of Octave as a project; it's fine for what it is. It just ended up being more of a stumbling block for beginners than a booster in that specific context.
It's nice to know that someone else suffered this pain. And that I bet on PGMs, which really turned out to be the wrong horse…
ha! I took at least one PGM class myself. I had a difficult time with the material.
Oh, the times when Coursera and Udacity were just starting... They were supposed to disrupt academia; it's a shame they never actually did.
The real value of a degree unfortunately isn't the education; it's the exclusivity of the program. When bootcamps realized this, some started having more stringent admissions.
There's a great recent book (Anne Trumbore's _The Teacher in the Machine_) on using technology to "disrupt" education, starting much earlier than you would think: mechanical devices in the early 20th century that could drill students with multiple-choice questions, running through basically pre-computer MOOCs that used radio and then TV to broadcast lectures, various educational software, and finally MOOCs like Coursera and Udacity.
I used Octave in place of MATLAB in an undergrad numerical analysis course 15 years ago. The language was completely compatible for what we did.
I'm not a Matlab user, but from what I can tell, even if the language can be cloned, there's a lot more to Matlab: It's a GUI driven software suite, with a lot of pre-written apps that eliminate the need for coding in many cases.
It comes with vendor support and "official-ness" for lack of a better word.
Things are changing rapidly in this area but it wasn't very long ago that most people reacted to open-source software as something weird that shouldn't be trusted.
Scilab is another MATLAB clone, but emphasizes features rather than compatibility.
For anyone else who hadn't heard of JupyterLite: it's like Jupyter Notebook/Lab, but it runs completely in your browser. No servers, no backend; everything executes client-side.
Python on Web Assembly has to be really slow.
It's slower than native, sure, but for education it's a game changer. Students can open a notebook in any browser, on any device (even a Chromebook or iPad) and start coding instantly: no installs, no setup issues. Perfect for workshops, classrooms, or sharing interactive tutorials. It runs real Python, so you can teach core concepts, plotting, and even simple data analysis right in the browser. For heavier computation you'd still offload to a remote kernel, but for learning and experimentation it's more than fast enough.
Thank you. I just tried it on my iPad, very cool.
Hmm. Do we expect X on Y to have run times more like X*Y or max(X,Y)? Or maybe some more complicated combination because you have to pay both their overheads but then once things start cranking you are just paying the per-element cost of one of the languages…
I'm not an expert, but I speculate that the compiler is unlikely to optimize the wasm binary better than a native x86 binary. Furthermore, every VM instruction will on average need more than one CPU instruction to execute. Intuitively, that suggests slower execution, and that is also what we see in practice with VMs.
Python is not a particularly fast language in the first place, due to poor memory utilization, hash-table lookups everywhere, and high function-call overhead.
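Those lookups are visible directly in the bytecode: even a trivial function re-resolves global names dynamically on every call. A quick probe with the standard `dis` module (names and function here are my own example, chosen only to expose the opcode):

```python
import dis

def sum_of_squares(xs):
    # `sum` is a global name, so each call of this function
    # re-resolves it through a dict-backed lookup.
    return sum(x * x for x in xs)

# Collect the opcode names CPython compiled this function to.
ops = [ins.opname for ins in dis.get_instructions(sum_of_squares)]
print("LOAD_GLOBAL" in ops)  # → True: `sum` is looked up at run time, not bound statically
```

None of this is specific to wasm; it's overhead the interpreter pays on any backend, which is why wasm mostly adds its own slowdown on top rather than changing Python's character.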
I always find the most attractive part buried under all those issues, which spoils the enjoyment the author built up. Shouldn't the diagram be up front, possibly followed by the next release's features, and only then the making-of (or the issues with the making-of)…
This is great! I always wanted a GNU Octave transpiled to other languages.
Octave has been embeddable as a library (from C++) for some time:
https://stackoverflow.com/questions/9246444/how-to-embed-the...
https://docs.octave.org/latest/Standalone-Programs.html
There is an OpenCL package to provide GPU acceleration:
https://gnu-octave.github.io/packages/ocl/
Unfortunately it looks like they did it wrong, providing explicit GPU types and functions instead of converting unmodified Octave code to run with GPU acceleration implicitly:
https://octave.sourceforge.io/ocl/function/oclArray.html
It would be awesome if Octave got implicit GPU acceleration in the browser with something like OpenCL. Unfortunately, OpenCL never made it to the web: WebCL was specified but never implemented in mainstream browsers:
https://en.wikipedia.org/wiki/WebCL
https://www.khronos.org/webcl/
WebCL is apparently being replaced by WebGPU:
https://stackoverflow.com/questions/11532281/how-to-use-webc...
https://gpuweb.github.io/gpuweb/
https://developer.chrome.com/docs/capabilities/web-apis/gpu-...
- unsolicited opinion -
It's always astonishing to me how the obvious path is rarely taken by industry, because writing open solutions is self-evidently less profitable than writing proprietary ones. Look up the history of the blue LED and countless other innovations to see how that works and why.
I'm hopeful that AI will relieve programmer burden enough that we can explore these obvious roads not traveled. Because we're off on a very long tangent from what mainline computer science evolution might have looked like without tech's wealth inequality.
Unfortunately I see two major (rarely discussed) pitfalls looming with AI:
1) Every tech innovation brings a higher workload for the same pay. The amount of knowledge required to be a full-stack developer in 2025 is higher than in 2015, which was higher than in 2005, which was higher than in 1995, and so on. Yet starting pay has not kept up with inflation.
2) With AI bringing pair programming everywhere, we may see a decline in overall code quality if humans don't have to deal with it directly. Extended pair programming can lead to over-engineered codebases that can only be read by teams of humans instead of individuals. So whereas one untrained hobbyist could build a website in 1995 using principles like data-driven design, declarative programming, and idempotence, today it requires a team to untangle the eventualities of imperative, nondeterministic async code that from a user's perspective is equivalent to simply hiding the progress bar in the browser.
That's why I'm such a proponent of alternative methods. Abstractions that are quite verbose to represent in, say, Python, can be expressed as one-liners in Octave. The only way to get more concise would be to move towards more of a functional assembly language like Lisp, at the cost of the syntactic sugar provided by array-based languages.
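To make the verbosity gap concrete from the Python side (a small example of my own, not from the thread): a 3-point moving average is an explicit indexing loop in plain Python, while array style collapses it to one expression, roughly what Octave's `conv(x, ones(1,3)/3, "valid")` does natively.

```python
import numpy as np

data = [4.0, 8.0, 6.0, 2.0, 10.0]
window = 3

# Plain Python: explicit windowing and indexing.
loop_avg = [sum(data[i:i + window]) / window
            for i in range(len(data) - window + 1)]

# Array style via NumPy: one expression, no index bookkeeping.
vec_avg = np.convolve(data, np.ones(window) / window, mode="valid")

print(loop_avg)  # both give the same 3-point moving average
```

The array-language version reads like the math; the loop version makes you re-derive the window arithmetic every time, which is the kind of abstraction tax the comment above is pointing at.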
TL;DR: I believe that the most direct path from J.A.R.V.I.S./Star Trek style AI prompts to readable but efficient code is through DSLs like Octave/MATLAB, and some of the lost ways of doing business logic in the 1980s like spreadsheets, HyperCard, and Microsoft Access or FileMaker. Open tools like a GPU-accelerated Octave would help us gain more leverage in writing software, and possibly speed the evolution of AI itself by helping us more closely express abstractions in code.
> alternative methods ... DSLs
This strongly agrees with you: https://alexalejandre.com/languages/end-of-programming-langs...