Beyond ‘Gotcha’ Questions: The LRU Cache Challenge for Real-World Skills

24 May 2024

The technical interview process is broken. That isn’t a hot take; I’d venture that few people would argue it’s working. So why hasn’t it been fixed?
[Image: DALL·E 3 representation of critical thinking]

The Problem with ‘Gotcha’ Questions

Let’s be honest: many technical interviews rely on ‘gotcha’ questions that do little to assess a candidate’s real-world skills. They suck. If you want proof, read about me falling flat on my face during a Square interview with one of these ‘gotcha’ questions in my previous article, Nailing the Interview: A No-Nonsense Approach to Showcasing Your Talents.
These questions often focus on obscure algorithms or trick questions that test how well someone can memorize textbook solutions rather than how they approach problem-solving in a practical setting. This approach is stressful for candidates and ineffective for employers looking to hire genuinely skilled developers.
So, what’s the alternative? In my opinion, we should trust folks who have a GitHub or other social media presence. Rather than waste their time on LeetCode, let’s spend that time interviewing them on critical thinking. But since that’s probably not in the cards yet, even with AI-assisted programming on the rise, we need a better way.
I want to start by saying I don’t do coding interviews often anymore; I’m usually in the culture-fit rounds. But when I do conduct technical interviews, I greatly prefer getting to know the candidate during a systems design or architecture round. It’s much more open-ended, and there’s no need for gotchas to confirm someone knows what’s on their resume.

Why the LRU Cache?

Enter the LRU (Least Recently Used) cache. I love this data structure, not just because it’s elegant and efficient, but because it tests a range of skills that are actually relevant to software development: edge cases, locking, flow control, structure. All of these matter in everyday software engineering, without the complicated overhead that does little more than ensure the candidate paid their penance to LeetCode for weeks before they had the honor of gracing your conference room. And to those who actually feel the technical software interview is a rite of passage: get over yourself.
Implementing an LRU cache requires an understanding of both data structures and algorithms, but it also demands creativity, problem-solving, and the ability to write clean, maintainable code.

What Makes the LRU Cache Challenge Effective

The LRU cache challenge is a fantastic interview question for several reasons:

  1. Relevance: Caching is widely used in real-world applications, from web browsers to databases. Understanding how to implement an LRU cache demonstrates practical knowledge that will be useful on the job.
  2. Complexity: While the LRU cache is not trivial, it’s also not so complex that it becomes a ‘gotcha’ question. It strikes the right balance, allowing candidates to showcase their skills without feeling overwhelmed.
  3. Problem-Solving: Implementing an LRU cache requires candidates to think critically about how to manage data efficiently. This is a real-world problem that developers face regularly, making it a great way to assess their problem-solving abilities.
  4. Code Quality: Because the implementation involves both data structures and algorithms, it provides a window into how candidates write and organize their code. Are they using clear, descriptive variable names? Is their code modular and easy to read? These are important factors in determining a candidate’s fit for a role.

A Step-by-Step Guide to the Technical Coding Challenge

For this section, I’m going to use examples written in Go. It’s my favorite language, followed closely by Ruby. No need to start an emacs vs vim flame war, and for those who’ve only used VSCode: Get Off My Cloud! jk 😄
I’d also like to think this section is useful for any technical coding challenge interview.

  1. Understanding the Problem and Scope: One of my favorite skills in candidates is curiosity. Questions like “Why do we need an LRU Cache?” would light up my eyes and definitely earn the candidate some brownie points. Understanding the use cases and edge cases is crucial. The answer might be, “We need a package generic enough for the entire company,” or “We want to cache a set of strings in the key-value store.” These questions matter. Knowing what you’re trying to build is essential, not just in an interview but throughout your career. Once I understand the problem and scope, I can start mapping that understanding onto code that solves the problem. You might even suggest a better solution once you grasp the use cases: if the interviewer says they want to store assets at the edge like a Content Delivery Network (CDN), maybe you suggest an LFU (Least Frequently Used) cache instead.
  2. Outlining the Basics: Start by explaining what an LRU cache is and why it’s useful. At this point, I’d also start writing out the supporting code. For example, in Go, I’d start writing out the interfaces, like this:
package cache

import (
	"time"
)

// A Cache is a generalized interface to a cache. See cache.LRU for a specific
// implementation (bounded cache with LRU eviction)
type Cache interface {
	// Get retrieves an element based on a key, returning nil if the element
	// does not exist
	Get(key string) interface{}

	// Put adds an element to the cache, returning the previous element
	Put(key string, value interface{}) interface{}

	// Delete deletes an element in the cache
	Delete(key string)
}

// Options control the behavior of the cache
type Options struct {
	// TTL controls the time-to-live for a given cache entry. Cache entries
	// that are older than the TTL will not be returned
	TTL time.Duration

	// InitialCapacity controls the initial capacity of the cache
	InitialCapacity int

	// RemovedFunc is an optional function called when an element
	// is scheduled for deletion
	RemovedFunc RemovedFunc
}

// RemovedFunc is a type for notifying applications when an item is
// scheduled for removal from the Cache. If f is a function with the
// appropriate signature and i is the interface{} scheduled for
// deletion, Cache calls go f(i)
type RemovedFunc func(interface{})

  3. Data Structures Involved: Discuss the key data structures needed for the implementation, such as hash maps and doubly linked lists. Maybe at this point you would start writing out your struct that will eventually satisfy that interface.

package cache

import (
	"container/list"
	"sync"
	"time"
)

// lru is a concurrent fixed size cache that evicts elements in lru order
type lru struct {
	mut      sync.Mutex
	byAccess *list.List
	byKey    map[string]*list.Element
	maxSize  int
	ttl      time.Duration
	rmFunc   RemovedFunc
}

type cacheEntry struct {
	key        string
	expiration time.Time
	value      interface{}
}

// New creates a new cache with the given options
func New(maxSize int, opts *Options) Cache {
	if opts == nil {
		opts = &Options{}
	}

	return &lru{
		byAccess: list.New(),
		byKey:    make(map[string]*list.Element, opts.InitialCapacity),
		ttl:      opts.TTL,
		maxSize:  maxSize,
		rmFunc:   opts.RemovedFunc,
	}
}

// NewLRU creates a new LRU cache of the given size, setting initial capacity
// to the max size
func NewLRU(maxSize int) Cache {
	return New(maxSize, nil)
}

// NewLRUWithInitialCapacity creates a new LRU cache with an initial capacity
// and a max size
func NewLRUWithInitialCapacity(initialCapacity, maxSize int) Cache {
	return New(maxSize, &Options{
		InitialCapacity: initialCapacity,
	})
}
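Before moving on, it can help to show the interviewer how calling code would consume this API. Here’s a quick sketch, where the import path and the logging callback are purely illustrative (and Get won’t return anything useful until the methods are implemented below):

package main

import (
	"fmt"
	"time"

	"example.com/cache" // hypothetical import path for the package above
)

func main() {
	// Construct a bounded cache with a TTL and a removal callback.
	c := cache.New(1024, &cache.Options{
		TTL: 5 * time.Minute,
		RemovedFunc: func(v interface{}) {
			fmt.Println("removed:", v)
		},
	})

	c.Put("greeting", "hello")
	fmt.Println(c.Get("greeting")) // hello (once Get and Put are implemented)
}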

  4. Algorithmic Approach: Outline the steps required to implement the LRU cache, including how to handle cache hits and misses. This is where you start outlining the edge cases and how they’d be handled. What happens during a cache hit? How does the structure change so a recently used entry survives eviction? How will you approach locking? These questions give the interviewer an understanding of what you’re considering and how you think. In my opinion, this is also your opportunity to confirm you’re on the right track and stay aligned with the goals of the use case. The standalone snippet below shows the core recency mechanics in isolation.
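To make those mechanics concrete, here’s a tiny standalone illustration (hypothetical, not part of the cache package) of the recency invariant behind every LRU: a hit moves the entry to the front of a doubly linked list, so the least recently used entry is always waiting at the back for eviction.

package main

import (
	"container/list"
	"fmt"
)

func main() {
	recency := list.New()
	index := map[string]*list.Element{}

	// Insert three keys; the most recently used key lives at the front.
	for _, k := range []string{"a", "b", "c"} {
		index[k] = recency.PushFront(k)
	}

	// A cache hit on "a" moves it to the front of the list...
	recency.MoveToFront(index["a"])

	// ...so the least recently used key ("b") is now at the back,
	// which is exactly where eviction looks first.
	evicted := recency.Remove(recency.Back()).(string)
	delete(index, evicted)

	fmt.Println("evicted:", evicted) // evicted: b
}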
  5. Code Implementation: Provide a simple, clean implementation of the LRU cache in your favorite language. For this particular type of coding challenge, I’ll always choose Go. Start by writing out the key functions, then flesh them out as it makes sense. Here’s an example LRU cache structure! I’ve used it numerous times at various companies. To see the production-ready library, check it out on GitHub.

// Get retrieves the value stored under the given key
func (c *lru) Get(key string) interface{} {
	// TODO: flesh out (see the sketch below)
	return nil
}

// Put puts a new value associated with a given key, returning the existing value (if present)
func (c *lru) Put(key string, value interface{}) interface{} {
	// TODO: flesh out (see the sketch below)
	return nil
}

// Delete deletes a key, value pair associated with a key
func (c *lru) Delete(key string) {
	// TODO: flesh out
}
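To give a feel for the fleshing-out step, here’s one way the Get and Put stubs might be filled in, assuming the lru struct and imports from earlier. This is a simplified sketch rather than the production library: it skips the RemovedFunc notification and leaves Delete as an exercise.

// Get retrieves the value stored under the given key, returning nil on a
// miss or when the entry has expired.
func (c *lru) Get(key string) interface{} {
	c.mut.Lock()
	defer c.mut.Unlock()

	elt, ok := c.byKey[key]
	if !ok {
		return nil // cache miss
	}
	entry := elt.Value.(*cacheEntry)

	// Expired entries are treated as misses and evicted eagerly.
	if c.ttl != 0 && time.Now().After(entry.expiration) {
		c.byAccess.Remove(elt)
		delete(c.byKey, key)
		return nil
	}

	// A hit marks the entry as most recently used and refreshes its TTL.
	c.byAccess.MoveToFront(elt)
	if c.ttl != 0 {
		entry.expiration = time.Now().Add(c.ttl)
	}
	return entry.value
}

// Put stores value under key, returning the previous value if present.
func (c *lru) Put(key string, value interface{}) interface{} {
	c.mut.Lock()
	defer c.mut.Unlock()

	// Update in place if the key already exists.
	if elt, ok := c.byKey[key]; ok {
		entry := elt.Value.(*cacheEntry)
		existing := entry.value
		entry.value = value
		if c.ttl != 0 {
			entry.expiration = time.Now().Add(c.ttl)
		}
		c.byAccess.MoveToFront(elt)
		return existing
	}

	// Otherwise insert at the front of the access list.
	entry := &cacheEntry{key: key, value: value}
	if c.ttl != 0 {
		entry.expiration = time.Now().Add(c.ttl)
	}
	c.byKey[key] = c.byAccess.PushFront(entry)

	// Evict the least recently used entry once the cache is over capacity.
	if len(c.byKey) > c.maxSize {
		oldest := c.byAccess.Remove(c.byAccess.Back()).(*cacheEntry)
		delete(c.byKey, oldest.key)
	}
	return nil
}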

  6. Testing and Optimization: Discuss how to test the implementation and potential optimizations. Start building a test harness if you have time. If you’re on the cusp between L4 and L5 (L5 being senior), tests will absolutely push you across the line. Folks who add test suites that pass will, 99% of the time, get a strong yes from me.
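If there’s time, a first test might look something like this, assuming the NewLRU constructor and the fleshed-out methods sketched above; a real suite would also cover TTL expiry, Delete, and concurrent access:

package cache

import "testing"

// TestLRUEviction exercises the basic hit/miss behavior and the LRU
// eviction order.
func TestLRUEviction(t *testing.T) {
	c := NewLRU(2)

	c.Put("a", 1)
	c.Put("b", 2)

	// Touching "a" makes "b" the least recently used entry.
	if got := c.Get("a"); got != 1 {
		t.Fatalf("Get(a) = %v, want 1", got)
	}

	// Adding a third entry should evict "b".
	c.Put("c", 3)
	if got := c.Get("b"); got != nil {
		t.Fatalf("Get(b) = %v, want nil after eviction", got)
	}
	if got := c.Get("c"); got != 3 {
		t.Fatalf("Get(c) = %v, want 3", got)
	}
}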
The technical interview process may be broken, but it doesn’t have to stay that way. By focusing on real-world challenges like the LRU cache, we can create a more effective, fair, and insightful interview process that benefits both candidates and employers. So, next time you’re preparing for an interview or designing one, consider ditching the ‘gotcha’ questions in favor of something that truly tests a candidate’s skills and potential.
Have you ever faced an LRU cache challenge in an interview? What are your thoughts on ‘gotcha’ questions versus real-world problem-solving? Share your experiences and insights in the comments below. And if you found this article helpful, please share it with your network. Let’s work together to make technical interviews better for everyone.
