Wednesday, December 10, 2008

Short-Answer Interview Questions

1. What's the difference between passing by value and passing by reference?

2. Describe the singleton pattern.

3. What's the difference between a static method and a static variable?

4. What's the difference between subclasses and interfaces?

5. What are generics?

6. What is the fastest sort? List some O(n log(n)) sorts. What is the pivot in QuickSort? How does the choice of pivot affect QuickSort's runtime?

7. What is the hashcode() / equals() contract in Java?

8. A user enters a URL in his browser, let's say "google.com/index.html". He presses enter, and a little while later the page renders. Describe what happens in the background.

9. What is AJAX?

Answers coming soon.

Technical Interview Coding Question

Q: We have a very large array A containing positive and negative integers. We want to find a slice (any slice) of this array (i.e. a sub-array of consecutive elements) with at least two elements, such that the sum of the elements in this slice is equal to 0. Bonus: find the largest such slice.

A: We have to beat the brute force solution: checking the sum of every possible subarray. Note that by keeping a running sum per start index (instead of re-summing each slice from scratch), each inner scan is O(n):
  bruteForce: input = array a of length n
    for _start = 0..n-2
      sum = a[_start]
      for _end = _start+1..n-1
        sum = sum + a[_end]
        if sum = 0
          return _start, _end
    return false
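As a concrete sketch, the brute force might look like this in Java (class and method names are my own, not from the original):

```java
// Brute-force search for a zero-summing slice with at least two elements.
// A running sum per start index keeps each inner scan O(n), for O(n^2) total.
public class ZeroSliceBrute {
    // Returns {start, end} (inclusive indexes) of a zero-summing slice,
    // or null if no such slice exists.
    static int[] findZeroSlice(int[] a) {
        for (int start = 0; start + 1 < a.length; start++) {
            int sum = a[start];
            for (int end = start + 1; end < a.length; end++) {
                sum += a[end];           // extend the slice by one element
                if (sum == 0) {
                    return new int[] { start, end };
                }
            }
        }
        return null;
    }
}
```

On {2, 3, -3, 5} this finds the slice {3, -3} at indexes 1..2.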
The time complexity is O(n^2), and the space is constant. We can sacrifice some space to decrease the time; importantly, we can use the silver bullet of all algorithms questions, the hash table. To do this, first find all the partial sums of subarrays that start at index 0 (if any of those is 0, we're done: that prefix is itself a zero-summing slice). Assuming none of them is 0, notice that subtracting one partial sum from another gives the sum of a subarray that doesn't start at index 0. So, technically, we could find a zero-summing subarray by looking for two partial sums whose difference is 0. That alone doesn't decrease complexity, but let's explore the idea: two partial sums whose difference is 0 have to be identical. Ding! We've just redefined the problem: how do we efficiently find two matching elements in an array? At this point, it should be clear that hashing each partial sum and looking for collisions will yield the answer in O(n) expected time. Returning the indexes of the subarray is trivial: we just attach metadata to each partial sum recording the index where it ends; a collision between the sums ending at i and j (with i < j) means the slice from i+1 through j sums to 0. Adding the hash table and storing partial sums adds only O(n) space, which is still reasonable.

For the bonus, we can find the longest such subarray by iterating all the way through the partial-sum array instead of returning at the first collision: store only the earliest index for each partial sum, and keep track of the longest subarray so far (by subtracting _start from _end). Code for the non-bonus version below:
  hashMagic: input = array a of length n
    define array b, b[0] = a[0]
    for i = 1..n-1
      b[i] = b[i-1] + a[i]

    hash key = 0, value = -1        // empty prefix; catches slices starting at index 0
    for j = 0..n-1
      if b[j] is already a key in the table
        _start = lookup(key = b[j]) + 1   // slice begins just after the earlier prefix
        _end = j
        return _start, _end
      hash key = b[j], value = j

    return false
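The pseudocode above could be sketched in Java with a HashMap (class and method names here are my own). Since the elements are nonzero, equal prefix sums are always at least two indexes apart, so the ≥2-element requirement comes for free:

```java
import java.util.HashMap;

// Hash-table approach: two equal prefix sums bracket a zero-summing slice.
// Seeding the map with (0, -1) handles a prefix that itself sums to 0.
public class ZeroSliceHash {
    // Returns {start, end} (inclusive indexes) of a zero-summing slice,
    // or null if no such slice exists. O(n) expected time, O(n) space.
    static int[] findZeroSlice(int[] a) {
        HashMap<Long, Integer> lastIndexOfSum = new HashMap<>();
        lastIndexOfSum.put(0L, -1); // empty prefix ends "at index -1"
        long sum = 0;
        for (int j = 0; j < a.length; j++) {
            sum += a[j];
            Integer i = lastIndexOfSum.get(sum);
            if (i != null && j - i >= 2) {
                // elements i+1 .. j sum to zero
                return new int[] { i + 1, j };
            }
            // keep only the earliest index per sum (useful for the bonus)
            lastIndexOfSum.putIfAbsent(sum, j);
        }
        return null;
    }
}
```

For the bonus, drop the early return, keep scanning, and track the maximum j - i seen; because the map stores the earliest index per sum, that maximum is the longest zero-summing slice.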