The Spring Framework’s legacy in the Java ecosystem is undeniable. Recognized for its powerful architecture, versatility, and constant growth, Spring remains at the forefront of Java development. The release of Spring Framework 6.x heralds a new era, with enhanced features and revisions that cater to the modern developer’s needs.
Spring Framework 6.x sets a new standard by updating the prerequisites for various libraries, ensuring better compatibility and performance.
The beating heart of the Spring Framework has undergone significant transformations:
Let’s break down the changes and provide examples to illustrate the transition from using java.net.URL constructors to a consistent URI-based resolution in JDK 20.
Before JDK 20, developers often used java.net.URL constructors to create URLs. For example:
URL url = new URL("https://www.example.com");
If there was a need to resolve a relative path against a base URL, it could be done like this:
URL baseUrl = new URL("https://www.example.com/base/");
URL relativeUrl = new URL(baseUrl, "relativePath");
In this example, relativeUrl would translate to https://www.example.com/base/relativePath.
In JDK 20, with the deprecation of java.net.URL constructors, the emphasis is on URI for resolution, which provides a more consistent approach, especially when handling relative paths.
To create a new URL, you can leverage URI and then convert it to a URL:
URI uri = new URI("https://www.example.com");
URL url = uri.toURL();
For relative path resolution:
URI baseUri = new URI("https://www.example.com/base/");
URI relativeUri = baseUri.resolve("relativePath");
URL relativeUrl = relativeUri.toURL();
Here, relativeUrl would again be https://www.example.com/base/relativePath.
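One subtlety worth noting: the examples above deliberately end the base with a trailing slash. Under the RFC 3986 rules that URI.resolve follows, a base without a trailing slash has its last path segment replaced during resolution. A small self-contained check (class name is illustrative):

```java
import java.net.URI;

public class UriResolutionGotcha {

    // Base WITH a trailing slash: the relative path is appended under /base/.
    static String withSlash() {
        return URI.create("https://www.example.com/base/").resolve("relativePath").toString();
    }

    // Base WITHOUT a trailing slash: RFC 3986 replaces the last segment ("base").
    static String withoutSlash() {
        return URI.create("https://www.example.com/base").resolve("relativePath").toString();
    }

    public static void main(String[] args) {
        System.out.println(withSlash());    // https://www.example.com/base/relativePath
        System.out.println(withoutSlash()); // https://www.example.com/relativePath
    }
}
```

Keeping this rule in mind avoids surprises when migrating code that previously relied on java.net.URL constructor behavior.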
Consider an uncommon scenario where a full URL is specified as a relative path:
With java.net.URL Constructors:
URL baseUrl = new URL("https://www.example.com/base/");
URL fullUrlAsRelative = new URL(baseUrl, "https://www.differentdomain.com/relativePath");
Previously, fullUrlAsRelative would resolve to https://www.differentdomain.com/relativePath.
With URI-Based Resolution:
URI baseUri = new URI("https://www.example.com/base/");
URI fullUriAsRelative = baseUri.resolve("https://www.differentdomain.com/relativePath");
URL fullUrlAsRelative = fullUriAsRelative.toURL();
With the URI-based resolution, fullUrlAsRelative would still resolve to https://www.differentdomain.com/relativePath, maintaining consistency in behavior.
In conclusion, the transition to URI-based resolution in JDK 20 provides a more consistent and reliable approach for URL creation and relative path resolution, even in uncommon cases.
In Spring Framework 6.1, the method AutowireCapableBeanFactory.createBean(Class, int, boolean) is deprecated. It is recommended to use the simpler, convention-based method createBean(Class). This new method is consistently employed internally in version 6.1, especially in classes such as SpringBeanJobFactory for Quartz integration and SpringBeanContainer for Hibernate integration.
Here are the examples to illustrate this change:
Suppose you want to create a bean of class MyBean:
AutowireCapableBeanFactory factory = applicationContext.getAutowireCapableBeanFactory();
// Using the deprecated createBean method
MyBean myBean = (MyBean) factory.createBean(MyBean.class, AutowireCapableBeanFactory.AUTOWIRE_BY_TYPE, true);
Here, the method createBean requires three arguments: the bean class, autowiring mode, and a boolean flag for dependency check.
With the new convention-based method, the process is more straightforward:
AutowireCapableBeanFactory factory = applicationContext.getAutowireCapableBeanFactory();
// Using the convention-based createBean method
MyBean myBean = factory.createBean(MyBean.class);
In this updated approach, the method infers the autowiring mode and dependency check based on conventions, leading to cleaner and more intuitive code.
For the internal workings of Spring 6, classes such as SpringBeanJobFactory and SpringBeanContainer now use the createBean(Class) method for bean creation.
For instance:
Previously, when a job was triggered in Quartz, SpringBeanJobFactory might have used the longer createBean variant to instantiate the required job beans. Now, it uses the convention-based createBean(Class) for this purpose.
Similarly, when Hibernate needs to instantiate an entity listener or any other custom beans, the SpringBeanContainer would leverage the createBean(Class) method.
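To make "convention-based" concrete, here is a deliberately tiny, hypothetical sketch. It is not Spring's actual code; it only mimics the idea behind createBean(Class): the factory needs no autowire-mode or dependency-check arguments because it resolves constructor parameters by type on its own.

```java
import java.lang.reflect.Constructor;
import java.util.HashMap;
import java.util.Map;

// Toy illustration only -- NOT Spring's implementation.
public class MiniBeanFactory {
    private final Map<Class<?>, Object> singletons = new HashMap<>();

    public void registerSingleton(Object bean) {
        singletons.put(bean.getClass(), bean);
    }

    public <T> T createBean(Class<T> beanClass) throws Exception {
        // Convention: take the first declared constructor and satisfy each
        // parameter from the registered singletons, matched by type.
        Constructor<?> ctor = beanClass.getDeclaredConstructors()[0];
        Class<?>[] paramTypes = ctor.getParameterTypes();
        Object[] args = new Object[paramTypes.length];
        for (int i = 0; i < paramTypes.length; i++) {
            args[i] = singletons.get(paramTypes[i]);
        }
        return beanClass.cast(ctor.newInstance(args));
    }

    // Demo types standing in for a Quartz job and its dependency.
    static class MailService { String send() { return "sent"; } }
    static class ReportJob {
        final MailService mail;
        ReportJob(MailService mail) { this.mail = mail; }
    }

    public static void main(String[] args) throws Exception {
        MiniBeanFactory factory = new MiniBeanFactory();
        factory.registerSingleton(new MailService());
        ReportJob job = factory.createBean(ReportJob.class);
        System.out.println(job.mail.send()); // prints "sent"
    }
}
```

The caller simply asks for a class; everything else is inferred, which is the ergonomic win the new createBean(Class) method offers.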
In essence, the adoption of the createBean(Class) method streamlines bean creation in Spring 6, both for developers and within the framework’s internal operations.
In Spring Framework 6, there’s a significant change regarding the conversion of arrays to collections. Previously, the exact type of collection resulting from such a conversion could vary. Now, when converting an array to a Collection target type, the framework consistently returns a List.
Suppose you have an array of strings:
String[] stringArray = {"apple", "banana", "cherry"};
And you want to convert it to a collection using Spring’s conversion service. Previously, the result could have been any implementation of the Collection interface, depending on internal conditions:
ConversionService conversionService = DefaultConversionService.getSharedInstance();
Collection<String> stringCollection = conversionService.convert(stringArray, Collection.class);
// The resultant stringCollection could be any type of Collection (e.g., Set, Queue, etc.)
With the update in Spring Framework 6, converting the same array to a Collection will consistently return a List:
String[] stringArray = {"apple", "banana", "cherry"};
ConversionService conversionService = DefaultConversionService.getSharedInstance();
Collection<String> stringCollection = conversionService.convert(stringArray, Collection.class);
// The resultant stringCollection is now guaranteed to be of type List
if (stringCollection instanceof List) {
System.out.println("The converted collection is a List!");
}
This change ensures that developers can expect a consistent type (List) when performing array-to-collection conversions, eliminating any uncertainty associated with the target collection type.
In the updated Spring Framework, two components, ThreadPoolTaskExecutor and ThreadPoolTaskScheduler, have been enhanced to ensure a more seamless shutdown process when closing the application context. By default, these components will not accept any new task submissions during the shutdown phase. However, for situations where task submissions are still required during shutdown, there’s an option to adjust a flag, though it might lead to a prolonged shutdown phase.
When the application context begins its closure process:
ApplicationContext context = new AnnotationConfigApplicationContext(MyAppConfig.class);
// ... your application logic here ...
context.close();
Both ThreadPoolTaskExecutor and ThreadPoolTaskScheduler will enter a graceful shutdown mode:
ThreadPoolTaskExecutor taskExecutor = context.getBean(ThreadPoolTaskExecutor.class);
// This will throw an exception as new tasks cannot be submitted during shutdown by default
taskExecutor.execute(() -> System.out.println("New Task!"));
If you need to allow task submissions during the context’s closure, adjust the acceptTasksAfterContextClose flag:
ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
taskExecutor.setAcceptTasksAfterContextClose(true);
ThreadPoolTaskScheduler taskScheduler = new ThreadPoolTaskScheduler();
taskScheduler.setAcceptTasksAfterContextClose(true);
By setting the flag to true, the components can still accept new tasks even during the shutdown. But keep in mind, enabling this flag may prolong the time it takes for the application context to completely close.
The ApplicationContext in the updated Spring Framework has seen improvements in its message resolution process. Now, attempts to retrieve messages from its internal MessageSource are restricted to when the context is actively running. If there are attempts to fetch messages after the context has been closed, an exception will be raised.
When the ApplicationContext is active, retrieving messages works seamlessly:
ApplicationContext context = new AnnotationConfigApplicationContext(MyAppConfig.class);
String message = context.getMessage("welcome.message", null, Locale.US);
System.out.println(message); // Outputs the corresponding welcome message
However, if you try to access the message after the context has been closed, it will result in an exception:
ApplicationContext context = new AnnotationConfigApplicationContext(MyAppConfig.class);
context.close();
// This will throw an IllegalStateException since the context is no longer active
String message = context.getMessage("farewell.message", null, Locale.US);
In essence, the changes ensure that developers are only fetching messages when the context is in the right state, leading to more predictable and consistent behavior.
While creating a native image in Spring, the detailed logs concerning pre-computed fields are now off by default. However, if you want to view these logs, you can activate them using the -Dspring.native.precompute.log=verbose argument during the compilation process.
When you build a native image without any additional arguments, you won’t see the verbose logs related to pre-computed fields:
$ native-image -jar myApp.jar
To get a detailed view of the pre-computed fields while building the native image, add the specific argument:
$ native-image -jar myApp.jar -Dspring.native.precompute.log=verbose
Upon executing this, the compiler will display the verbose logs related to pre-computed fields, offering more insights during the image construction process.
Data operations and transactions get a boost with strategic enhancements.
Web development with Spring sees groundbreaking changes:
Spring MVC and WebFlux have introduced enhanced method validation for controller inputs, ensuring better data quality. This validation specifically targets controller method parameters carrying constraint annotations, that is, annotations meta-annotated with @Constraint, such as @NotBlank or @Size.
In previous setups, you might have had a controller like:
@RestController
@Validated
public class MyController {
@PostMapping("/endpoint")
public ResponseEntity<?> processData(@Valid MyRequestBody body) {
// process data
}
}
With this setup, validations were typically applied at the argument resolver level.
Now, with the new method validation:
@RestController
public class MyController {
@PostMapping("/endpoint")
public ResponseEntity<?> processData(@NotBlank String input, @Valid MyRequestBody body) {
// process data
}
}
In this example, the @NotBlank constraint on the input parameter triggers the new built-in method validation, so the class-level @Validated annotation is no longer required. To fully utilize this, remove @Validated from controllers that only rely on parameter constraints and @Valid request bodies. Remember, this avoids the risk of double validation and centralizes the validation process for better consistency and clarity.
Messaging gets a clear focus on heightened security and superior functionality:
For Message-Driven Applications, there are significant enhancements in security and functionality. The RSocket interface client has changed its default timeout approach, relying on the RSocket client’s settings instead. Also, to boost security, evaluation of SpEL expressions from untrusted sources has been disabled by default, especially in WebSocket messaging.
Previously, there might have been a 5-second default timeout on certain methods. This has been altered:
// Older Approach: The RSocket interface client had its own default timeout.
RSocketRequester requester = ...;
String response = requester.route("some.route").data("request").retrieveMono(String.class).block();
// New Approach: The timeout now depends on the configuration of the RSocket client and its transport.
RSocketRequester.Builder builder = ...;
RSocketRequester requester = builder.tcp("localhost", 7000);
The change provides more flexibility, as the timeout behavior is primarily dictated by the RSocket client and its transport settings. See reference issue #30248.
To improve security, SpEL expressions evaluation from untrusted sources is turned off by default in WebSocket messaging. If you need the SpEL-based selector header support, it needs to be explicitly enabled:
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {
@Override
public void configureMessageBroker(MessageBrokerRegistry registry) {
// Explicitly enabling the selector header
registry.enableSimpleBroker().setSelectorHeaderName("selector");
}
}
This means that by default, applications won’t evaluate potentially harmful SpEL expressions, especially in WebSocket scenarios. If needed, developers can opt-in with the above configuration. See reference issue #30550
Transitioning from Spring Framework 5.x to 6.x? Here’s a concise migration guide:
In recent changes to the Spring framework, specific annotations have undergone a transition to new namespaces, reflecting broader shifts in the Java ecosystem. These transitions are fundamental and crucial for developers to comprehend and adapt to, ensuring the smooth operation of their applications.
As part of the Java EE to Jakarta EE transition, the JSR-330 annotations, including @Inject, have moved to the jakarta.inject namespace. Similarly, the JSR-250 annotations @PostConstruct and @PreDestroy have moved to jakarta.annotation.
Consider a previous implementation:
import javax.inject.Inject;
import javax.annotation.PostConstruct;
public class SampleService {
@Inject
private SomeDependency someDependency;
@PostConstruct
public void init() {
// Initialization logic
}
}
In the updated scenario, the imports would shift to:
import jakarta.inject.Inject;
import jakarta.annotation.PostConstruct;
It’s worth noting that, for the time being, Spring will continue detecting the old javax equivalents, which is beneficial for those applications relying on pre-compiled binaries.
The core container of Spring has undergone a significant shift in how it identifies bean properties. Traditionally, it relied on the default java.beans.Introspector. However, recent updates showcase a deviation from this norm.
Transition from java.beans.Introspector
Spring’s core container now determines basic bean properties without resorting to the default java.beans.Introspector. This change aims to enhance efficiency but can lead to disparities for those accustomed to the 5.3.x version and its intricate JavaBeans usage.
For those who wish to maintain full compatibility with the 5.3.x version, there’s a provision to revert to the older style. By adding the entry org.springframework.beans.BeanInfoFactory=org.springframework.beans.ExtendedBeanInfoFactory to a META-INF/spring.factories file, users can re-enable the full use of java.beans.Introspector as was the case in version 5.3.
On the flip side, users who are still on 5.3.x but wish to experience the improved introspection performance of the 6.0-style property determination can do so. This can be achieved by inserting org.springframework.beans.BeanInfoFactory=org.springframework.beans.SimpleBeanInfoFactory in a custom META-INF/spring.factories file.
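For reference, the two META-INF/spring.factories variants described above would look like this (only one of the two entries would appear in a given application, depending on which direction you are tuning):

```
# On 6.x: restore the full java.beans.Introspector behavior of Spring 5.3.x
org.springframework.beans.BeanInfoFactory=org.springframework.beans.ExtendedBeanInfoFactory

# On 5.3.x: opt in to the faster 6.0-style introspection
org.springframework.beans.BeanInfoFactory=org.springframework.beans.SimpleBeanInfoFactory
```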
Other Notable Changes
The transition from javax annotations to jakarta annotations is evident with the core container now detecting @Inject in jakarta.inject and both @PostConstruct and @PreDestroy in jakarta.annotation. For now, Spring also recognizes the javax equivalents.
LocalVariableTableParameterNameDiscoverer is deprecated and on its way out. Any successful resolution attempt by it now triggers a warning. To avoid the warning, compile Java sources with the Java 8+ -parameters flag rather than the -debug compiler flag.
LocalValidatorFactoryBean has updated its dependency to rely on standard parameter name resolution under Bean Validation 3.0. This configuration also considers additional Kotlin reflection if Kotlin is detected in the system. Hence, for parameter names in Bean Validation setups, it’s recommended to compile Java sources with the Java 8+ -parameters flag.
A shift in preference from ListenableFuture to CompletableFuture is evident, with the former being deprecated.
The core container has now implemented a strict checking mechanism for methods with the @Async annotation. They must return either Future or void, and this rule is now actively enforced.
Lastly, SimpleEvaluationContext now disables array allocations by default, consistent with its existing restrictions on regular constructor resolution.
In summary, these transformations reflect Spring’s continuous efforts to optimize its core container, ensuring it remains robust and efficient for developers.
The evolution of concurrent task handling in software applications has seen a pivotal shift. One of the significant changes in recent times is the migration from ListenableFuture to CompletableFuture. This transition isn’t just a mere replacement of one library for another. Instead, it epitomizes an effort to embrace modern capabilities and ensure efficiency in concurrent operations.
Historically, the Spring framework has employed ListenableFuture for managing asynchronous computation tasks. This approach provided a mechanism to register callbacks that would execute once the asynchronous task was completed. While useful, the capabilities of ListenableFuture were limited compared to the more contemporary CompletableFuture.
CompletableFuture offers a more robust and flexible API for asynchronous programming. It doesn’t just allow the registration of callbacks but also supports combining multiple asynchronous computations, thus enabling developers to chain tasks seamlessly. Its non-blocking nature further ensures that resources are used optimally, reducing the overhead and potential for bottlenecks.
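The difference is easy to see in plain JDK code. A minimal sketch (method names are illustrative) of the combining and chaining that CompletableFuture enables, which ListenableFuture’s callback registration could not express as directly:

```java
import java.util.concurrent.CompletableFuture;

public class CompletableFutureChaining {

    static CompletableFuture<String> fetchUser() {
        return CompletableFuture.supplyAsync(() -> "alice");
    }

    static CompletableFuture<Integer> fetchScore() {
        return CompletableFuture.supplyAsync(() -> 42);
    }

    static String summary() {
        // Combine two independent async computations, then transform the
        // result -- nothing blocks until the final join().
        return fetchUser()
                .thenCombine(fetchScore(), (user, score) -> user + ":" + score)
                .thenApply(String::toUpperCase)
                .join();
    }

    public static void main(String[] args) {
        System.out.println(summary()); // ALICE:42
    }
}
```

With ListenableFuture, each step would have required a separately registered callback; here the pipeline reads top to bottom as a single expression.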
For those still reliant on ListenableFuture, it’s crucial to understand its deprecation. The recommendation is to transition to CompletableFuture to leverage its advanced capabilities. This shift has been highlighted in notable Spring updates, such as reference issue #27780
As we stand on the precipice of technological evolution in the Java ecosystem, Spring Framework 6.x shines as a beacon of innovation, setting new standards in software development. It’s more than just an upgrade; it encapsulates Spring’s vision of continuous innovation, enhanced security, and equipping developers with futuristic tools. With each iteration, the framework has consistently demonstrated an unyielding commitment to enhancing the developer experience, streamlining integrations, fortifying security, and fostering a vibrant development environment. Recognizing the changes and their implications is crucial for smooth integration. Let’s summarize the monumental strides this latest release takes and how it paves the way for the future, urging developers to harness these novelties and architect state-of-the-art applications.
Dovetailing with modern technologies ensures the framework remains contemporary and pertinent. The upgrades to essential libraries like SnakeYAML, Jackson, and Kotlin (both Coroutines and Serialization) are not mere incremental changes but strategic decisions to keep the ecosystem vibrant, efficient, and forward-compatible.
The meticulous refinements in the core container showcase Spring’s dedication to adaptability. From a more seamless URL resolution mechanism aligning with JDK 20’s developments to the modern createBean(Class) method for intuitive bean creation, Spring Framework 6.x demonstrates an unwavering focus on improving the foundational components.
By addressing and enhancing user experience subtleties, like clearer error messaging in JPA bootstrapping and a more intuitive exception handling mechanism, the framework strengthens its commitment to seamless data access and robust transactional integrity.
With groundbreaking enhancements to both Spring MVC and WebFlux, Spring Framework 6.x is shaping the future of web development. The thoughtful refinements, ranging from enhanced validation for controller parameters to the rejuvenation of HTTP client-server interfaces, underscore a vision for a web that’s more responsive, secure, and developer-friendly.
In an age where security is paramount, Spring’s strategic decisions to fortify its messaging applications — like the recalibration of the RSocket interface client and the pro-security move to disable certain SpEL expressions by default — resonate with the needs of contemporary applications.
Migration to Spring Framework 6.x, though promising, necessitates an understanding of its nuances. It’s pivotal to recognize the relocations of pivotal annotations and the shift in mechanisms like bean property determination. The subtle nudge towards CompletableFuture from ListenableFuture further epitomizes Spring’s vision of embracing modernity.
In essence, Spring Framework 6.x isn’t merely an upgrade — it’s a testament to the framework’s evolutionary spirit, consistently pushing the envelope in the realms of innovation, security, and efficiency. It heralds a new era for Java developers, offering a suite of tools and enhancements tailored for the challenges and opportunities of tomorrow. Embracing this release is not just about leveraging its features but about aligning with a vision of progressive, secure, and efficient software development. As developers and technology enthusiasts, the call is clear: to welcome, understand, and harness the power of Spring Framework 6.x to shape the future of robust, scalable, and innovative applications.