Fortran

Alain Ketterlin alain at dpt-info.u-strasbg.fr
Tue May 13 13:33:32 EDT 2014


Mark H Harris <harrismh777 at gmail.com> writes:

> On 5/12/14 3:44 AM, Alain Ketterlin wrote:

>> When you are doing scientific computation, this overhead is
>> unacceptable, because you'll have zillions of computations to perform.
>
>     I'm still trying to sort that out. I have not tested this yet, but
> it looks like Julia is fully dynamic (yes it has types too), and it
> does parallel processing at its core, so the zillions of computations
> are being handled all at once, depending on how many processors|cores
> you have.

The amount of parallelism is a property of the program, not a property
of the language... I hope you don't expect all your Julia statements to
execute in parallel.

(And Fortran/C/C++ have leveraged OpenMP and MPI for years. Julia's
distributed arrays and fork-join model may make things a bit easier for
the programmer, but I doubt it will lead to better performance.)
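To make the "parallelism is in the program" point concrete, here is a
minimal fork-join sketch in Python (used since this is a Python list;
the function names are mine, not from any library beyond the stdlib).
The program has to say explicitly which computation is fanned out to
workers -- roughly what Julia's pmap-style constructs ask of you too:

```python
# Sketch: parallelism must be expressed by the program.  The fork-join
# here is explicit -- nothing runs in parallel unless we ask for it.

from multiprocessing import Pool

def square(x):
    return x * x

def parallel_squares(n, workers=4):
    # Fork: fan the work out to worker processes; join: gather results
    # back in order.
    with Pool(workers) as pool:
        return pool.map(square, range(n))

if __name__ == "__main__":
    print(parallel_squares(5))  # [0, 1, 4, 9, 16]
```

Whether this beats a sequential loop depends entirely on how much work
each task does, not on which language you wrote it in.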

>> Julia provides a way to make things fast: typing. If you provide
>> explicit types, the dynamic typing part obviously disappears, and
>> the overhead is removed.
>
>     Julia is dynamic (depending on how far you want to go with that)
> but what makes it fast is the JIT. It is almost accomplishing C/C++
> and FORTRAN speeds (even though dynamic) because it is compiling
> on-the-fly.

JITing by itself has little impact on performance (as Unladen Swallow
has shown for Python). For the JIT to produce code as fast as a
Fortran/C/C++ compiler would, two conditions must be met: 1) the JIT has
to have as much information about types as the statically-typed language
compilers have, and 2) the time spent by the JIT compiler has to be
negligible compared to the time spent executing the code.

If your program is not statically typed, the code produced by the JIT is
very unlikely to be more efficient than the code that would be executed
by the virtual machine (or interpreter), because the jitted code needs
to do everything the virtual machine does. And the JIT takes time.
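Here is a sketch, in Python for illustration, of what I mean (the
function is hypothetical, not real JIT output): without type
information, the code a JIT emits for something as simple as `a + b`
still has to carry type guards, and its slow path is the same generic
dispatch the interpreter performs.

```python
# Hypothetical sketch of JIT-generated code for "a + b" when the types
# of a and b are unknown at compile time.  The guards cannot be removed
# without static type information.

def jitted_add(a, b):
    if type(a) is int and type(b) is int:
        return a + b          # fast path: could be a native integer add
    if type(a) is float and type(b) is float:
        return a + b          # fast path: could be a native float add
    # Slow path: full generic dispatch -- exactly the work the
    # interpreter's loop does for every operation.
    return a + b

print(jitted_add(2, 3))       # 5
print(jitted_add(2.0, 3.5))   # 5.5
```

Declared types let a compiler emit only the fast path and drop the
guards entirely; that, not JITing as such, is where the speed comes from.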

[...]
>     Yes, and more+  Gnu GMP & MPFR are not new to me, but the wrapper
> and repl are !  I am just realizing the power behind the libraries in
> this context, but I am very impressed with the functionality wrapped
> around the Gnu stuff... the interface is quite nice.

Good. I don't know Matlab enough to see how it compares.

>> Sorry, I was comparing to Fortran, and its use in scientific computing.
>> Self modifying code is totally irrelevant there.
>
>    no, no, no...  there has to be a value add for scientists to move
> away from R or Matlab, or from FORTRAN. Why go to the trouble?
[...]
>    Why?, glad you asked.  Enter self modifying code for one. The
> influence of Lisp|Scheme is potentially huge here.

I wouldn't bet a penny on this. (Probably because I have yet to see a
real, convincing application of self-modifying code. I mean something
which is not covered by JITs---GPU shaders, and things like that.)

The real nice thing that makes Julia a different language is the
optional static typing, which the JIT can use to produce efficient code.
It's the only meaningful difference with the current state of python.
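As a rough illustration (a sketch in Python, since Python has no such
compiler; both functions are mine and behave identically here), the
difference optional typing makes is between a loop whose every `+` must
dispatch on runtime types and a loop a compiler could lower to native
arithmetic:

```python
# Sketch: what optional static typing would buy a compiler.

def dynamic_sum(xs):
    total = 0
    for x in xs:
        # Generic '+': dispatches on the runtime type of every element,
        # so a JIT without type info must keep the dispatch (or guards).
        total = total + x
    return total

def typed_sum(xs):
    # Assume (as a Julia-style annotation would declare) that xs holds
    # only machine integers: a compiler could emit a guard-free loop of
    # native integer adds here.  In plain Python the code is the same;
    # the type information is what changes, not the source.
    total = 0
    for x in xs:
        total += x
    return total

print(dynamic_sum([1, 2, 3]))  # 6
print(typed_sum([1, 2, 3]))    # 6
```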

-- Alain.
