Tuesday, July 8, 2025

Java Memory Management and Garbage Collection

Java has always been renowned for its automatic memory management. Developers can focus more on writing business logic while the Java Virtual Machine (JVM) handles memory allocation and garbage collection under the hood. However, as applications scale and performance becomes critical, understanding how memory management and garbage collection (GC) work becomes essential.

Java Memory Model: The Basics

The Java memory model is built around a few core memory areas managed by the JVM:

1. Heap Memory

  • Young Generation: Contains short-lived objects. Includes Eden and two Survivor spaces.

  • Old Generation: Stores long-lived objects.

2. Stack Memory

  • Used for method invocations and local variables.

  • Each thread has its own stack.

3. Metaspace (formerly Permanent Generation)

  • Stores class metadata. Introduced in Java 8 as a native-memory replacement for the Permanent Generation.

4. Program Counter and Native Method Stack

  • The program counter keeps track of the address of the JVM instruction currently being executed; each thread has its own.

  • Native method stacks support native (non-Java) methods. 
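
These regions can be inspected at runtime through the standard java.lang.management API; a minimal sketch (the exact numbers will vary by JVM and settings):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemoryRegions {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();

        // Heap: where objects live (Young + Old generations).
        MemoryUsage heap = memory.getHeapMemoryUsage();
        // Non-heap: Metaspace, code cache, etc.
        MemoryUsage nonHeap = memory.getNonHeapMemoryUsage();

        System.out.println("Heap used: " + heap.getUsed() / (1024 * 1024) + " MB");
        System.out.println("Heap max:  " + heap.getMax() / (1024 * 1024) + " MB");
        System.out.println("Non-heap used: " + nonHeap.getUsed() / (1024 * 1024) + " MB");
    }
}
```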


How Garbage Collection Works

Garbage Collection in Java identifies objects that are no longer reachable from the application and reclaims the memory. The main steps include:

  • Reachability Analysis: Traces references from GC roots (local variables, static fields, etc.).
  • Mark Phase: Marks all reachable objects.
  • Sweep/Relocate Phase: Deletes unmarked objects and optionally compacts the heap.
  • Promotion: Objects surviving multiple GC cycles in the Young Generation are promoted to the Old Generation. 
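
Reachability can be observed with a WeakReference, which does not keep its referent alive. A small sketch; note that System.gc() is only a hint, so whether the reference is actually cleared afterwards is not guaranteed:

```java
import java.lang.ref.WeakReference;

public class ReachabilityDemo {
    public static void main(String[] args) {
        Object strong = new Object();                 // reachable from a GC root (local variable)
        WeakReference<Object> weak = new WeakReference<>(strong);

        System.out.println(weak.get() != null);       // true: still strongly reachable

        strong = null;                                // drop the strong reference
        System.gc();                                  // request (not force) a collection

        // After a GC cycle the weak reference is usually cleared,
        // but System.gc() is only a hint, so this may print either value.
        System.out.println("cleared: " + (weak.get() == null));
    }
}
```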

GC can be categorised into:

  • Minor GC: Cleans the Young Generation.
  • Major GC: Cleans the Old Generation.
  • Full GC: Cleans the entire heap.


Evolution of Garbage Collection in Java

Java 1.0 - 1.3

  • Basic mark-and-sweep collector.

Java 1.4 - 5

  • Introduced Generational GC.

  • Concurrent Mark Sweep (CMS) introduced for low pause times.

Java 6 - 7

  • Improved tuning and parallelism in collectors.

Java 8

  • G1 GC gains maturity.

  • Metaspace replaces Permanent Generation.

Java 9 - 10

  • G1 GC becomes the default.

Java 11 - 14

  • ZGC (Java 11) and Shenandoah (Java 12) introduced as experimental collectors for ultra-low pause times.

  • CMS, deprecated since Java 9, is removed in Java 14.

Java 15 - 17

  • ZGC becomes production-ready (Java 15).

Java 21

  • Generational ZGC introduced as a preview.

ZGC vs. Generational ZGC

ZGC

  • Non-generational, single-heap design.

  • Sub-millisecond pause times.

  • Scans the entire heap.

Generational ZGC (Java 21+)

  • Divides heap into Young and Old generations.
  • Uses minor GCs to clean Young Generation quickly.
  • Improves throughput while maintaining low latency.
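
Assuming a JDK that ships both modes, the collector is selected with command-line flags (app.jar is a placeholder):

```shell
# Plain (non-generational) ZGC, a production collector since Java 15
java -XX:+UseZGC -jar app.jar

# Generational ZGC (Java 21+, initially opt-in behind an extra flag)
java -XX:+UseZGC -XX:+ZGenerational -jar app.jar
```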

Feature          ZGC          Generational ZGC
---------------  -----------  ----------------
Heap Layout      Flat         Young + Old Gen
Pause Times      Low          Even lower
Throughput       Moderate     Higher
Java Version     11+          21+ (Preview)


Choosing the Right Garbage Collector

Key Factors to Consider:

  1. Latency Requirements

    • Real-time apps: Use ZGC, Shenandoah, or G1.

  2. Heap Size

    • Small (<4 GB): Serial or G1 GC

    • Large (>32 GB): G1, ZGC, Shenandoah

  3. Throughput Needs

    • Batch jobs: Parallel GC

  4. Startup Time

    • Serial GC for fast-starting small apps

  5. Resource Constraints

    • In containers, prefer G1 or ZGC

  6. JDK Version

    • Use a GC supported by your JDK (e.g., Generational ZGC in Java 21+)
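
Each of the collectors above is selected with a single JVM flag; a quick reference (app.jar is a placeholder, and the availability of each collector depends on the JDK version and build):

```shell
java -XX:+UseSerialGC     -jar app.jar   # small heaps, fast startup
java -XX:+UseParallelGC   -jar app.jar   # throughput-oriented batch jobs
java -XX:+UseG1GC         -jar app.jar   # balanced default (Java 9+)
java -XX:+UseZGC          -jar app.jar   # ultra-low pause times (production since Java 15)
java -XX:+UseShenandoahGC -jar app.jar   # low pause times (not in every JDK build)
```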


Flowchart: Quick Decision

(flowchart image omitted)

Conclusion

Java’s memory management and garbage collection have evolved tremendously, making it easier to build scalable and performant applications. With the arrival of Generational ZGC in Java 21, developers now have access to a GC that balances ultra-low latency and high throughput.

Choosing the right GC depends on your application’s goals—whether it's minimising pause times, maximising throughput, or optimising for constrained environments. Understanding these tools helps you get the most out of your JVM.

 

 

Monday, May 5, 2025

Consequences of high volume traffic

Let’s consider a Java service running on a single server. Clients make requests to the service via REST API calls. The REST API implementations may involve DB operations, use of multiple threads, calls to external APIs, operations with high memory usage, etc. What will be the consequences if there is a huge spike in requests?

1. Database Overload: The database connection pool could be overloaded due to a sudden spike. A database connection pool overload occurs when too many connections are requested from a pool, either due to a large number of concurrent users or inefficient application code. This can lead to: 

  • Slow response times: Applications become unresponsive or take a long time to complete requests.
  • Database overload: The database server may become overwhelmed, leading to resource contention and reduced throughput.
  • Application crashes: Applications may crash due to an inability to acquire a database connection.
  • Database outages: In severe cases, the database server may become unavailable.  

How to address overload:

  • Optimize application code: Ensure that connections are properly closed after use and that queries are efficient. 
  • Adjust pool size: Increase the pool size to accommodate peak demand, but be mindful of database resources. 
  • Implement connection management: Use tools like connection pooling libraries or middleware to manage connections more effectively. 
  • Monitor database performance: Use monitoring tools to track connection pool usage and identify potential bottlenecks. 
  • Implement connection rate limits: Limit the number of connections an application can establish to prevent overwhelming the database. 
  • Consider database-level limits: Some databases offer features to limit the number of concurrent connections.  
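
A connection rate limit can be sketched with a counting semaphore placed in front of the pool. The ConnectionGuard class below is a made-up illustration of the idea, not a substitute for a real pooling library:

```java
import java.util.concurrent.Semaphore;

// Hypothetical guard that fails fast instead of letting callers
// pile up behind an exhausted connection pool.
public class ConnectionGuard {
    private final Semaphore permits;

    public ConnectionGuard(int maxConnections) {
        this.permits = new Semaphore(maxConnections);
    }

    /** Try to reserve a connection slot without blocking; false means "shed the request". */
    public boolean tryAcquire() {
        return permits.tryAcquire();
    }

    /** Release the slot once the connection is returned to the pool. */
    public void release() {
        permits.release();
    }

    public int available() {
        return permits.availablePermits();
    }
}
```

Failing fast when no permit is available lets the application return an error (or queue the request) instead of overwhelming the database.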

2. Increased CPU and Memory Usage: If the application is not optimized, a surge in requests can cause high CPU and memory consumption, potentially leading to slow response times or crashes.

3. Network Latency and Timeout Issues: High traffic can overwhelm network bandwidth, increasing response times or causing timeouts if external services (like APIs or payment gateways) are involved.

4. Garbage Collection (GC) Pressure: High memory allocation due to excessive object creation may trigger frequent garbage collection, causing application pauses and affecting performance.

5. Thread Contention and Blocking Issues: If your application relies on synchronized methods or database connections with limited threads, multiple requests may get stuck waiting, leading to bottlenecks. A resource on how to diagnose Java thread contention:

https://medium.com/@jaehnpark/how-to-define-java-thread-contention-87196c447e12  

 

How to mitigate:

  • Implement caching (like Redis) to reduce database load. Caching is useful for repeated fetch requests. There should be a proper cache eviction policy so that the cache itself is not overloaded.
  • Use asynchronous processing (CompletableFuture, ExecutorService) to handle requests efficiently.
  • Keep requests in a queue before doing the actual processing. The queue can be either in-memory or external.
  • Optimize thread management using a proper thread pool configuration.
  • Use rate limiting (e.g., API Gateway, Bucket4j) to prevent overload.
  • Scale the system horizontally by adding more servers behind a load balancer.
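
Several of the points above (asynchronous processing, request queueing, and thread pool tuning) can be combined in a single ThreadPoolExecutor configuration. The pool and queue sizes below are illustrative only:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedExecutor {
    public static ThreadPoolExecutor create() {
        return new ThreadPoolExecutor(
                4,                                  // core threads kept alive
                8,                                  // upper bound under load
                60, TimeUnit.SECONDS,               // idle time before extra threads die
                new ArrayBlockingQueue<>(100),      // bounded queue: requests wait here first
                new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure when the queue is full
    }

    public static void main(String[] args) throws InterruptedException {
        ThreadPoolExecutor pool = create();
        for (int i = 0; i < 10; i++) {
            int id = i;
            pool.execute(() -> System.out.println("handled request " + id));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

The bounded queue absorbs bursts, and CallerRunsPolicy slows the producer down instead of dropping requests or growing the queue without limit.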

 

 


Saturday, March 30, 2019

Java Stream Concepts

Streams are monads. In functional programming, a monad is a structure that represents computations defined as sequences of steps. A Java Stream doesn’t store data; it operates on the source data structure (such as a collection or array) and produces pipelined data. On that pipelined data (similar to an assembly line/conveyor belt) we can perform specific operations. Streams make bulk processing on collections convenient and fast.

Stream operations are either intermediate or terminal. Intermediate operations return a stream, so multiple intermediate operations can be chained; such a chain of stream operations is also known as an operation pipeline. An important characteristic of intermediate operations is laziness: intermediate operations are only executed when a terminal operation is present. Terminal operations are either void or return a non-stream result.

Java Stream operations use functional interfaces, which makes them a very good fit for functional programming using lambda expressions. Most of these functional operations must be both non-interfering and stateless. A function is non-interfering when it does not modify the underlying data source of the stream. A function is stateless when the execution of the operation is deterministic, i.e. no lambda expression depends on any mutable variables or state from the outer scope that might change during execution.

Java streams are consumable: once a terminal operation has been invoked, the stream is closed and cannot be reused. Since the data is produced on demand, there is no way to keep a reference to a stream for future use.
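
Calling a second terminal operation on an already-consumed stream throws an IllegalStateException, which makes the point easy to demonstrate:

```java
import java.util.stream.Stream;

public class StreamReuse {
    public static void main(String[] args) {
        Stream<String> stream = Stream.of("a1", "a2", "b1");

        stream.anyMatch(s -> s.startsWith("a"));       // terminal operation: the stream is now consumed

        try {
            stream.noneMatch(s -> s.startsWith("b"));  // second terminal operation on the same stream
        } catch (IllegalStateException e) {
            System.out.println("stream has already been operated upon or closed");
        }
    }
}
```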

Processing Order
In a stream, each element moves along the chain vertically, i.e. the operations are executed one after another on each element of the stream. Due to this processing order, operations like filters should be placed at the beginning of the chain to reduce the number of executions.
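
The effect is easy to verify by counting how often map() runs when a filter is placed before it; AtomicInteger is used here only as a convenient counter:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class FilterFirst {
    public static void main(String[] args) {
        List<String> myList = Arrays.asList("d2", "a2", "b1", "b3", "c");
        AtomicInteger mapCalls = new AtomicInteger();

        // filter first: map only runs for elements that pass the filter
        myList.stream()
                .filter(s -> s.startsWith("a"))
                .map(s -> { mapCalls.incrementAndGet(); return s.toUpperCase(); })
                .forEach(System.out::println);

        System.out.println("map executed " + mapCalls.get() + " time(s)"); // 1, not 5
    }
}
```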

Stream Creation
Java stream can be created from array and collections in many ways. Here are some examples:
  • Stream.of() can be used in several ways. We can create a stream by providing some elements with specific values.
            Stream<Integer> stream = Stream.of(1, 2, 3, 4);

  • An array of objects can be used to create a stream.
            Stream<Integer> stream = Stream.of(new Integer[]{1, 2, 3, 4});

  • Java 8 added a new stream() method to the Collection interface, so a stream can be created from any existing list or other collection.
            List<String> myList = Arrays.asList("a1", "a2", "b1", "c2", "c1");
            Stream<String> stream = myList.stream();
    

Intermediate Operations
Here are some common stream intermediate operations:
  • filter: The filter() method takes a Predicate whose condition is applied to filter the stream. It returns a new stream containing the matching subset of the original stream.
            List<String> myList = Arrays.asList("a1", "a2", "b1", "c2", "c1");
            myList.stream()
                    .filter(s -> s.startsWith("c"))
                    .forEach(System.out::println);
    
  • map: The map() method takes a Function and applies it to all elements of the stream.
            List<String> myList = Arrays.asList("a1", "a2", "b1", "c2", "c1");
            myList.stream()
                    .map(String::toUpperCase)
                    .forEach(System.out::println);
    
  • sorted: sorted() method returns a stream consisting of the elements of this stream, sorted according to natural order.
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            myList.stream()
                    .sorted()
                    .forEach(System.out::println);
    
    The overloaded version takes a Comparator as parameter.
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            myList.stream()
                    .sorted(Comparator.reverseOrder())
                    .forEach(System.out::println); 
    
    The Comparator can be provided as lambda expression:
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            myList.stream()
                    .sorted((s1, s2) -> s2.compareTo(s1))
                    .forEach(System.out::println); 
    
  • flatMap: flatMap() transforms each element of the stream into a stream of other objects, so each object can be transformed into zero, one, or multiple other objects backed by streams. It helps us flatten the data structure to simplify further operations.
                Stream<List<String>> namesOriginalList = Stream.of(
                        Arrays.asList("Sajib"),
                        Arrays.asList("Salman", "Anitam"),
                        Arrays.asList("Sazzad"));
                namesOriginalList.flatMap(strList -> strList.stream())
                        .forEach(System.out::println);
    
In real scenarios, several intermediate operations are usually chained.
        List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
        myList.stream()
                .filter(s -> s.startsWith("c"))
                .map(String::toUpperCase)
                .sorted()
                .forEach(System.out::println);

Terminal Operations
Computation on the source data is only performed when the terminal operation is initiated, and source elements are consumed only as needed. All intermediate operations are lazy, so they are not executed until the result of the processing is actually needed. Here are some common stream terminal operations:
  • count: We can use this terminal operation to count the number of items in the stream.
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            System.out.println(myList.stream().count());
    
  • forEach: This can be used for iterating over the stream.
             List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
                myList.stream()
                        .forEach(System.out::println);
    
    
  • collect: collect is an extremely useful terminal operation to transform the elements of the stream into a different kind of result, especially a collection like List, Set, or Map.
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            List<String> filtered = myList.stream()
                    .filter(s -> s.startsWith("c"))
                    .collect(Collectors.toList());
            System.out.println(filtered);
    
    
  • findFirst: This is a short-circuiting terminal operation. It returns an Optional describing the first element of the stream, or an empty Optional if the stream is empty.
            List<String> myList = Arrays.asList("b1", "a1", "a2", "c2", "c1");
            Optional<String> filtered = myList.stream()
                    .filter(s -> s.startsWith("c"))
                    .findFirst();
            if (filtered.isPresent()) {
                System.out.println(filtered.get());
            }
    


Friday, March 29, 2019

Object Oriented Design Principles

The famous Head First Design Patterns book discusses 9 important object-oriented design principles.

Here is a summary of those principles:
  1. Identify the aspects of your application that vary and separate them from what stays the same. Encapsulate what varies.
  2. Favor composition over inheritance.
  3. Program to an interface, not an implementation.
  4. Strive for loosely coupled designs between objects that interact.
  5. Classes should be open for extension, but closed for modification.
  6. Depend on abstractions. Do not depend on concrete classes (Dependency Inversion Principle). Guidelines to achieve dependency inversion principle:
    • No variable should hold a reference to a concrete class.
    • No class should derive from a concrete class.
    • No method should override an implemented method of any of its base classes.
  7. Principle of Least Knowledge - talk only to your immediate friends.
  8. The Hollywood principle - Don't call us, we'll call you.
  9. A class should have only one reason to change.

Observer Pattern

The Observer pattern defines a one-to-many dependency between objects so that when one object changes state, all of its dependents are notified and updated automatically.

The Subject contains a list of observers to notify of any change in its state, so it should provide methods with which observers can register and unregister themselves. The Subject also contains a method to notify all observers of a change; it can either send the update while notifying the observers or provide another method for observers to pull the update.

An Observer should have a method to set the object to watch and another method that the Subject uses to notify it of any updates.
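
A minimal sketch of these roles (the class and method names below are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;

interface Observer {
    void update(int newState);
}

class Subject {
    private final List<Observer> observers = new ArrayList<>();
    private int state;

    public void register(Observer o)   { observers.add(o); }
    public void unregister(Observer o) { observers.remove(o); }

    public void setState(int state) {
        this.state = state;
        notifyObservers();                 // push the new state to every registered observer
    }

    private void notifyObservers() {
        for (Observer o : observers) {
            o.update(state);
        }
    }
}
```

Since Observer has a single abstract method, observers can also be registered as lambdas.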

Category:
Behavioral design pattern


Associated design principle:
Strive for loosely coupled designs between objects that interact.

Java’s built-in Observer pattern:
The java.util package ships an implementation of the Observer pattern: an Observer interface and an Observable class, similar to the Subject and Observer roles above. The drawback is that Observable is a class, not an interface, and both types have been deprecated since Java 9.

Here is a UML diagram taken from Head First Design Patterns showing the usage of Java’s built-in Observer pattern:


Pros:
  • Provides a loosely coupled design between objects that interact.
    • The Subject only knows that observers implement the Observer interface. Nothing more.
    • There is no need to modify Subject to add or remove observers.
    • We can reuse subject and observer classes independently of each other.
  • Allows you to send data to many other objects in a very efficient manner.
  • Follows the Open/Closed Principle.

Cons:
  • Subscribers are notified in random order.
