My programming credentials:
3 credits of C++.
6 credits of Java.
6 credits of AMPL.
12 credits of R.
As for OOP, I understand its concepts. Writing a giant program is like building a snap-together puzzle: you take a little bit of syntax, turn it into an object, which is like a little puzzle piece, and snap all those pieces together to create the picture on the box. A central point of OOP is that code is encapsulated, so an error can be tracked down to the object responsible, pinpointed, and fixed.
I know why you use a linked list vs. an array (a list and its derivative data structures are much easier to insert into and append to...however, an array gives constant-time access by index).
Inheritance: deriving a special case of a class. MountainBike extends Bike; so does BMX, in a different way. Stock and Bond both extend FinancialInstrument. Etc...
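The bike example can be sketched in Java (the class and method names are my own illustration):

```java
// A base class and two specializations of it.
class Bike {
    int wheels() { return 2; }
}

class MountainBike extends Bike {
    boolean hasSuspension() { return true; } // adds behavior to the base
}

class BMX extends Bike {
    boolean canDoTricks() { return true; } // specializes in a different way
}

public class InheritanceDemo {
    public static void main(String[] args) {
        // Both subclasses inherit wheels() from Bike without redefining it.
        System.out.println(new MountainBike().wheels()); // 2
        System.out.println(new BMX().wheels());          // 2
    }
}
```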
Overriding: a subclass redefines a method with the same name and signature as its parent's. So if Bike has a method go and MountainBike defines its own go, MountainBike overrides Bike's version. (Overloading is something else: giving one class several methods with the same name but different parameter lists.)
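A minimal Java sketch of the two terms side by side, using made-up bike classes:

```java
class Bike {
    String go() { return "pedaling"; }
    // Overloading: same name, different parameter list, same class.
    String go(int gear) { return "pedaling in gear " + gear; }
}

class MountainBike extends Bike {
    // Overriding: same name AND signature as the parent's method.
    @Override
    String go() { return "climbing"; }
}

public class OverrideDemo {
    public static void main(String[] args) {
        System.out.println(new Bike().go());         // pedaling
        System.out.println(new Bike().go(3));        // pedaling in gear 3
        System.out.println(new MountainBike().go()); // climbing
    }
}
```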
Polymorphism: treating objects of different subclasses through a common base-class interface, so a Bike variable can hold a MountainBike and a call to go runs the subclass's version. (Converting a string to a double and back is casting/parsing, not polymorphism.)
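A minimal Java sketch of that runtime dispatch, with illustrative class names:

```java
class Bike {
    String go() { return "pedaling"; }
}

class MountainBike extends Bike {
    @Override
    String go() { return "climbing"; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        // A base-class reference holding a subclass object:
        Bike b = new MountainBike();
        // The call dispatches to the object's actual type at runtime.
        System.out.println(b.go()); // climbing
    }
}
```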
Virtual functions: methods dispatched at runtime based on the object's actual type rather than the variable's declared type; in C++ you mark them virtual, while in Java instance methods behave this way by default. Abstract functions are the ones from abstract classes that you simply have as placeholders for inheritors to implement.
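A minimal Java sketch of an abstract placeholder method (Shape and Square are my own example):

```java
// Abstract method: a placeholder every concrete subclass must fill in.
abstract class Shape {
    abstract double area();                          // no body here
    String describe() { return "area = " + area(); } // but usable anyway
}

class Square extends Shape {
    final double side;
    Square(double side) { this.side = side; }
    @Override
    double area() { return side * side; }
}

public class AbstractDemo {
    public static void main(String[] args) {
        Shape s = new Square(3); // can't do new Shape() -- it's abstract
        System.out.println(s.area()); // 9.0
    }
}
```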
Model-View-Controller: a common programming paradigm. The model holds the data and logic, the view displays it, and the controller handles input and updates the model.
Pass by value/reference: pass a copy of the variable's value, or its address in memory. Which one you get depends on the language and the parameter's type, not on the data structure: Java, for example, passes everything by value, but the value of an object variable is a reference, so a callee can still mutate the object it points to.
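A minimal Java sketch of the distinction under Java's everything-by-value semantics:

```java
public class PassDemo {
    static void bumpPrimitive(int n) { n++; } // operates on a copy of the int

    static void bumpArray(int[] a) { a[0]++; } // the reference is copied, but
                                               // both copies point at the same array

    public static void main(String[] args) {
        int x = 1;
        bumpPrimitive(x);
        System.out.println(x);      // 1 -- the caller's int is untouched

        int[] arr = {1};
        bumpArray(arr);
        System.out.println(arr[0]); // 2 -- the shared array was mutated
    }
}
```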
Big O notation: asymptotic notation--the running time of a program as a function of the size of its input. P vs. NP: whether a problem can actually be solved in time polynomial in the input size (P), or whether a proposed solution can merely be verified in polynomial time (NP). NP-complete problems are the hardest ones in NP; no polynomial-time algorithm is known for them, so with a large enough input they'll take until the universe comes to an end twice over to arrive at a solution.
Search/sort: comparison-based sorting has a floor of O(n log(n)); quicksort hits that on average (its worst case is O(n^2)), while merge sort and heapsort guarantee it. Searching sorted data is O(log(n)) via binary search.
Recursion: a function calls a smaller version of itself, bubbling down to the base case and then bubbling back up to produce the answer.
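The classic Java sketch of that bubbling is factorial:

```java
public class RecursionDemo {
    // Bubbles down to the base case (0! = 1), then multiplies
    // on the way back up: 5 * 4 * 3 * 2 * 1.
    static long factorial(int n) {
        if (n == 0) return 1;        // base case
        return n * factorial(n - 1); // smaller version of the same problem
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // 120
    }
}
```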
Quicksort: a divide-and-conquer sort. Select a pivot, loop through the data structure, and put everything larger than the pivot to its right and everything smaller to its left; the pivot is then in its final position. Recursively repeat this process on the two pieces on either side of the pivot (the split is at the pivot, not the middle) until the pieces have at most one element, doing the swaps with a temp-variable mechanism (create temp variable, swap variable values), and the whole structure ends up sorted in place.
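A minimal in-place Java sketch of that process (Lomuto-style partition, last element as pivot):

```java
import java.util.Arrays;

public class QuickSortDemo {
    // Partition around a pivot, then recurse on the two pieces
    // on either side of the pivot's final position.
    static void quicksort(int[] a, int lo, int hi) {
        if (lo >= hi) return;   // base case: 0 or 1 elements
        int pivot = a[hi];      // last element as pivot
        int i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) { swap(a, i, j); i++; }
        }
        swap(a, i, hi);          // pivot lands at its final index i
        quicksort(a, lo, i - 1); // split is AT the pivot...
        quicksort(a, i + 1, hi); // ...which is excluded from both pieces
    }

    static void swap(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp; // temp-variable swap
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5, 6};
        quicksort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 5, 5, 6, 9]
    }
}
```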
Types of data structures:
Arrays: contiguous blocks of memory, like tables--indexed in constant time, but fixed in size.
Linked lists: your basic node and pointer type of data structure. Each node contains a data value and points to the memory address of the next node in the list. Nodes can be singly linked (each points to the next) or doubly linked (each also points back to the previous).
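A minimal Java sketch of a singly linked node and a walk along the links:

```java
public class LinkedListDemo {
    // A singly linked node: a value plus a reference to the next node.
    static class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    public static void main(String[] args) {
        // Build 1 -> 2 -> 3 and traverse it by following the links.
        Node head = new Node(1, new Node(2, new Node(3, null)));
        int sum = 0;
        for (Node n = head; n != null; n = n.next) sum += n.value;
        System.out.println(sum); // 6
    }
}
```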
Queue: a linked data structure in which operations are performed on a first-in, first-out (FIFO) basis.
Stack: a linked data structure in which operations are performed on a last-in, first-out (LIFO) basis.
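Both behaviors can be sketched with Java's ArrayDeque, which works as either:

```java
import java.util.ArrayDeque;

public class StackQueueDemo {
    public static void main(String[] args) {
        // Queue: first in, first out.
        ArrayDeque<Integer> queue = new ArrayDeque<>();
        queue.addLast(1); queue.addLast(2); queue.addLast(3);
        System.out.println(queue.removeFirst()); // 1 -- oldest element first

        // Stack: last in, first out.
        ArrayDeque<Integer> stack = new ArrayDeque<>();
        stack.push(1); stack.push(2); stack.push(3);
        System.out.println(stack.pop());         // 3 -- newest element first
    }
}
```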
Binary tree: a linked data structure in which each node has at most two children.
Heap: a (usually complete) binary tree maintaining the heap property: each parent is ordered relative to its children--e.g., every parent is at least as large as its children in a max-heap. This keeps the minimum or maximum at the root and is what backs priority queues.
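A quick illustration with Java's PriorityQueue, which is implemented as a binary min-heap:

```java
import java.util.PriorityQueue;

public class HeapDemo {
    public static void main(String[] args) {
        // The smallest element is always at the root,
        // regardless of insertion order.
        PriorityQueue<Integer> heap = new PriorityQueue<>();
        heap.add(5); heap.add(1); heap.add(3);
        System.out.println(heap.poll()); // 1
        System.out.println(heap.poll()); // 3
        System.out.println(heap.poll()); // 5
    }
}
```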
Hash table: a data structure that maps keys to array slots via a hash function, giving near-constant-time insertion and lookup. (Hashes themselves are also used to quickly compare two files, which is probably where I got that.)
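A quick illustration with Java's HashMap (the example keys are my own):

```java
import java.util.HashMap;

public class HashDemo {
    public static void main(String[] args) {
        // The key is hashed to pick a bucket, so lookup needs no scan.
        HashMap<String, Integer> credits = new HashMap<>();
        credits.put("Java", 6);
        credits.put("C++", 3);
        System.out.println(credits.get("Java"));      // 6
        System.out.println(credits.containsKey("R")); // false
    }
}
```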
Red-black trees: never covered these. Sorry.
Depth/breadth first search: the basic graph traversal algorithms. BFS explores level by level and finds shortest paths when all edge weights are equal; DFS follows one branch as deep as possible before backtracking. Both run in O(V + E).
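A minimal Java sketch of BFS computing fewest-edge distances on a small made-up graph:

```java
import java.util.ArrayDeque;
import java.util.Arrays;

public class BfsDemo {
    // BFS from src: dist[v] = fewest edges from src to v (-1 if unreachable).
    // Each node and edge is touched once: O(V + E).
    static int[] bfs(int[][] adj, int src) {
        int[] dist = new int[adj.length];
        Arrays.fill(dist, -1);
        dist[src] = 0;
        ArrayDeque<Integer> queue = new ArrayDeque<>();
        queue.add(src);
        while (!queue.isEmpty()) {
            int u = queue.remove();
            for (int v : adj[u]) {
                if (dist[v] == -1) { dist[v] = dist[u] + 1; queue.add(v); }
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        // adj[i] lists node i's neighbors.
        int[][] adj = { {1, 2}, {3}, {3}, {} };
        System.out.println(Arrays.toString(bfs(adj, 0))); // [0, 1, 1, 2]
    }
}
```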
Kruskal's/Prim's algorithm: minimum spanning tree algorithms. Kruskal's takes a bunch of tiny blobs and merges them into one giant blob, repeatedly adding the globally smallest edge that doesn't create a cycle. Prim's starts a blob at one node and has it absorb the rest: each step it greedily finds the smallest edge from the tree to an unvisited node and connects it, until it becomes one giant blob.
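A minimal Java sketch of Prim's blob-growing on a small made-up graph:

```java
import java.util.PriorityQueue;

public class PrimDemo {
    // Prim's algorithm on a symmetric adjacency matrix (0 = no edge):
    // grow one tree, always absorbing the cheapest edge to an outside node.
    static int mstWeight(int[][] w) {
        int n = w.length;
        boolean[] inTree = new boolean[n];
        // Min-heap of (edge weight, node), starting at node 0.
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> a[0] - b[0]);
        pq.add(new int[]{0, 0});
        int total = 0;
        while (!pq.isEmpty()) {
            int[] top = pq.poll();
            int cost = top[0], u = top[1];
            if (inTree[u]) continue; // already absorbed
            inTree[u] = true;
            total += cost;
            for (int v = 0; v < n; v++)
                if (w[u][v] > 0 && !inTree[v]) pq.add(new int[]{w[u][v], v});
        }
        return total;
    }

    public static void main(String[] args) {
        int[][] w = {
            {0, 2, 3, 0},
            {2, 0, 1, 4},
            {3, 1, 0, 5},
            {0, 4, 5, 0},
        };
        System.out.println(mstWeight(w)); // 7 (edges 0-1, 1-2, 1-3)
    }
}
```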
Traveling salesman: shortest tour covering an arbitrary set of points and returning to the starting location. The decision version is NP-complete. Brute force is O(n!); the best known exact algorithm (Held-Karp dynamic programming) runs in O(n^2 * 2^n).
Dijkstra's algorithm: the shortest path from one point to another. Greedily settles the closest unsettled node each step. Assumes no negative weights. O((V + E) log V) with a binary heap.
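A minimal Java sketch of that greedy settling, using a priority queue on a small made-up graph:

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public class DijkstraDemo {
    // Dijkstra's algorithm on an adjacency matrix (0 = no edge).
    // Assumes no negative weights.
    static int[] shortestPaths(int[][] w, int src) {
        int n = w.length;
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[src] = 0;
        // Min-heap of (distance, node): always settle the closest node next.
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> a[0] - b[0]);
        pq.add(new int[]{0, src});
        while (!pq.isEmpty()) {
            int[] top = pq.poll();
            int d = top[0], u = top[1];
            if (d > dist[u]) continue; // stale heap entry
            for (int v = 0; v < n; v++) {
                if (w[u][v] > 0 && dist[u] + w[u][v] < dist[v]) {
                    dist[v] = dist[u] + w[u][v];
                    pq.add(new int[]{dist[v], v});
                }
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        int[][] w = {
            {0, 4, 1, 0},
            {0, 0, 0, 1},
            {0, 2, 0, 5},
            {0, 0, 0, 0},
        };
        System.out.println(Arrays.toString(shortestPaths(w, 0))); // [0, 3, 1, 4]
    }
}
```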
All pairs shortest path: the Floyd-Warshall algorithm creates a table of the shortest distance between any two points in O(n^3).
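The standard triple loop (Floyd-Warshall) can be sketched in Java:

```java
import java.util.Arrays;

public class FloydWarshallDemo {
    static final int INF = 1_000_000; // stand-in for "no edge" (sums stay overflow-free)

    // For each intermediate node k, try routing every pair (i, j)
    // through k: three nested loops, O(n^3).
    static int[][] allPairs(int[][] w) {
        int n = w.length;
        int[][] d = new int[n][];
        for (int i = 0; i < n; i++) d[i] = w[i].clone();
        for (int k = 0; k < n; k++)
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    if (d[i][k] + d[k][j] < d[i][j])
                        d[i][j] = d[i][k] + d[k][j];
        return d;
    }

    public static void main(String[] args) {
        int[][] w = { // small made-up directed graph
            {0,   3,   INF},
            {INF, 0,   1},
            {2,   INF, 0},
        };
        System.out.println(Arrays.deepToString(allPairs(w)));
        // [[0, 3, 4], [3, 0, 1], [2, 5, 0]]
    }
}
```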
LU decomposition/Householder QR decomposition: linear algebra factorizations that reduce an arbitrary matrix to triangular form (LU: lower times upper triangular; QR: orthogonal times upper triangular) for the purpose of numerical stability and ease of solving linear systems. Both run in O(n^3).
So yeah...I've definitely seen these concepts before. But A) that was sophomore year at Lehigh and B) I don't know the syntax to code them. I certainly know what they do and how to "pseudocode" them, but not the exact code per language.