If all you are doing is run-of-the-mill serial algorithms (or even simple parallel ones), I don't think you need C++ these days. However, a lot of these companies have huge investments in legacy software, which probably explains their focus on maintaining C++. That said, I do see many quant job ads now asking for Python, R, and Java, so it probably depends on how old the firm is. Long term, I see C/C++ programmers becoming like the old COBOL coders who were suddenly in demand during Y2K, when all of the legacy banking code had to be updated. But we are probably ten or more years away from that day.
If you want to do real high-performance computing, C or C++ is still the best way to go. There is a great body of HPC libraries dating back to the 70s (LAPACK, BLAS, MPI), plus newer ones like MKL, the CUDA libraries, and PETSc. These are all either written in FORTRAN or expose C interfaces.
If you use interpreted or byte-code languages like Java, C#, Matlab, or Python, sooner or later you bump up against their limitations. For instance, I've found Java has trouble when too many files are open (it doesn't release file handles promptly) and with efficient memory management and garbage collection. Once you get to thousands of files and tens of gigabytes of data, these things start popping up.
That said, I think you could design a very effective byte-code language for HPC. The main problem is that decades of work went into FORTRAN and C, and I do wish organizations would modernize some of it, or at least write good wrappers for modern languages. It's even hard to find C++ wrappers for most of this stuff.