Java’s Streams API and functional interfaces offer powerful tools to process files and I/O data in a clean, declarative way. Instead of writing complex loops and managing resources manually, you can leverage stream pipelines to read, transform, and aggregate file content efficiently and safely.
Files.lines()
The Files.lines(Path) method returns a lazy stream of lines from a file. This allows processing large files without loading the entire content into memory at once.
Suppose we want to process a text file to find every line containing a given keyword, convert the matching lines to uppercase, print them, and count how many there are.
Here’s how this can be done functionally:
import java.nio.file.*;
import java.io.IOException;
import java.util.stream.Stream;

public class FileProcessingExample {
    public static void main(String[] args) {
        Path filePath = Paths.get("example.txt");
        String keyword = "java";

        // Use try-with-resources for automatic resource management
        try (Stream<String> lines = Files.lines(filePath)) {
            long count = lines
                .filter(line -> line.toLowerCase().contains(keyword))
                .map(String::toUpperCase)
                .peek(System.out::println) // Print each matching line
                .count();

            System.out.println("Total lines containing '" + keyword + "': " + count);
        } catch (IOException e) {
            System.err.println("Error reading file: " + e.getMessage());
        }
    }
}
In this pipeline:
- filter and map describe what to do rather than how.
- try-with-resources ensures the file stream is closed automatically.
- The IOException is handled in a try-catch block outside the stream.

Since Java streams do not natively support checked exceptions in lambdas, you can either catch the exception around the whole pipeline, as above, or wrap checked exceptions into unchecked ones inside the lambda.
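For example, a small adapter can turn a lambda that throws a checked exception into a standard java.util.function.Function. The CheckedFunction interface and unchecked helper below are a minimal illustrative sketch, not part of the JDK:

// requires: import java.util.function.Function;

// Illustrative functional interface that allows checked exceptions
@FunctionalInterface
interface CheckedFunction<T, R> {
    R apply(T t) throws Exception;
}

// Illustrative helper: rethrows any checked exception as an unchecked one
static <T, R> Function<T, R> unchecked(CheckedFunction<T, R> f) {
    return t -> {
        try {
            return f.apply(t);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    };
}

// Usage: Files.size throws IOException, so it cannot be passed to map directly
// paths.stream().map(unchecked(Files::size)).forEach(System.out::println);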
Streams allow aggregating data easily:
// Example: Count occurrences of words starting with "f"
try (Stream<String> lines = Files.lines(filePath)) {
    long count = lines
        .flatMap(line -> Stream.of(line.split("\\s+")))
        .filter(word -> word.startsWith("f"))
        .count();
    System.out.println("Words starting with 'f': " + count);
} catch (IOException e) {
    System.err.println("Error reading file: " + e.getMessage());
}
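Collectors make richer aggregations just as easy. As a sketch (assuming the same filePath plus imports for java.util.Map and java.util.stream.Collectors), a case-insensitive word-frequency map could be built like this:

// Example: build a case-insensitive word-frequency map
try (Stream<String> lines = Files.lines(filePath)) {
    Map<String, Long> frequencies = lines
        .flatMap(line -> Stream.of(line.toLowerCase().split("\\s+")))
        .filter(word -> !word.isEmpty())
        .collect(Collectors.groupingBy(word -> word, Collectors.counting()));
    frequencies.forEach((word, n) -> System.out.println(word + ": " + n));
} catch (IOException e) {
    System.err.println("Error reading file: " + e.getMessage());
}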
By combining Java’s Streams API with file I/O, you can write concise, readable, and efficient code for processing text files. The functional approach encourages immutability, lazy evaluation, and easy composition of operations, while try-with-resources handles resource management seamlessly. This leads to safer, more maintainable file-processing code.
Parsing and transforming JSON data is a common task in modern Java applications, especially when dealing with APIs or configuration files. While popular libraries like Jackson and Gson handle JSON serialization and deserialization, you can integrate functional programming techniques to process and transform JSON data in a clean, declarative style.
Jackson is one of the most widely used JSON libraries in Java. It can convert JSON into Java objects (POJOs) or generic structures like JsonNode. Combined with Java Streams and lambdas, you can process JSON collections and transform data efficiently.
Suppose you have a JSON array of users, and you want to parse it, filter users by age, and transform their names to uppercase.
import com.fasterxml.jackson.databind.*;
import java.util.*;
import java.util.stream.*;

public class JsonFunctionalExample {
    public static void main(String[] args) throws Exception {
        String json = """
            [
              {"name": "Alice", "age": 30},
              {"name": "Bob", "age": 22},
              {"name": "Charlie", "age": 25}
            ]
            """;

        ObjectMapper mapper = new ObjectMapper();

        // Deserialize JSON array into List<User>
        List<User> users = Arrays.asList(mapper.readValue(json, User[].class));

        // Use stream to filter and transform user names
        List<String> names = users.stream()
            .filter(user -> user.age >= 25)
            .map(user -> user.name.toUpperCase())
            .collect(Collectors.toList());

        names.forEach(System.out::println); // ALICE, CHARLIE
    }

    static class User {
        public String name;
        public int age;

        // Default constructor needed by Jackson
        public User() {}
    }
}
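As an alternative to binding through a User[] array, Jackson can also bind the JSON directly to a List<User> using a TypeReference (a short sketch reusing the same mapper and json):

// requires: import com.fasterxml.jackson.core.type.TypeReference;
List<User> users = mapper.readValue(json, new TypeReference<List<User>>() {});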
Jackson also supports a tree model (JsonNode), which allows functional traversal and transformations without full binding.
JsonNode root = mapper.readTree(json);
List<String> filteredNames = StreamSupport.stream(root.spliterator(), false)
    .filter(node -> node.get("age").asInt() >= 25)
    .map(node -> node.get("name").asText().toUpperCase())
    .collect(Collectors.toList());
filteredNames.forEach(System.out::println); // ALICE, CHARLIE
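If the transformed values need to go back out as JSON, they can be collected into an ArrayNode and serialized with the same mapper (a brief sketch; ArrayNode lives in com.fasterxml.jackson.databind.node):

// Collect the transformed names back into a JSON array and serialize it
ArrayNode result = mapper.createArrayNode();
filteredNames.forEach(result::add);
System.out.println(mapper.writeValueAsString(result)); // ["ALICE","CHARLIE"]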
Gson can deserialize JSON similarly, allowing you to use streams for post-processing:
import com.google.gson.*;
import com.google.gson.reflect.TypeToken;

Gson gson = new Gson();
List<User> users = gson.fromJson(json, new TypeToken<List<User>>(){}.getType());
List<String> names = users.stream()
    .filter(u -> u.age >= 25)
    .map(u -> u.name.toUpperCase())
    .collect(Collectors.toList());
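Like Jackson, Gson also offers a tree model. With Gson 2.8.6 or later, JsonParser can produce a JsonArray that streams in much the same way (a sketch assuming StreamSupport and the Gson element types are imported):

// Tree-model equivalent of the Jackson JsonNode example
JsonArray array = JsonParser.parseString(json).getAsJsonArray();
List<String> upperNames = StreamSupport.stream(array.spliterator(), false)
    .filter(el -> el.getAsJsonObject().get("age").getAsInt() >= 25)
    .map(el -> el.getAsJsonObject().get("name").getAsString().toUpperCase())
    .collect(Collectors.toList());
upperNames.forEach(System.out::println); // ALICE, CHARLIE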
Functional programming enhances JSON parsing and transformation by:
- expressing filters and mappings declaratively instead of with manual loops,
- avoiding mutable intermediate state, and
- making each transformation step easy to compose, reuse, and test.

Combining powerful JSON libraries with functional patterns results in clean, expressive, and maintainable data-processing code.
Parsing CSV files is a common task when dealing with tabular data. Using Java Streams and functional interfaces, we can build a clean, concise CSV parser that reads lines from a file, splits fields, maps them to domain objects, filters invalid rows, and collects results—all in a declarative style.
import java.nio.file.*;
import java.io.IOException;
import java.util.*;
import java.util.stream.*;

public class CsvParserExample {

    // Domain class representing a Person
    static class Person {
        String name;
        int age;
        String email;

        Person(String name, int age, String email) {
            this.name = name;
            this.age = age;
            this.email = email;
        }

        @Override
        public String toString() {
            return name + " (" + age + "), " + email;
        }
    }

    public static void main(String[] args) {
        Path csvFile = Paths.get("people.csv");

        try (Stream<String> lines = Files.lines(csvFile)) {
            List<Person> people = lines
                // Skip header line (assuming first line is headers)
                .skip(1)
                // Split each line by comma into String array
                .map(line -> line.split(","))
                // Filter out invalid lines (e.g., wrong number of fields)
                .filter(fields -> fields.length == 3)
                // Map fields to Person objects, parsing age as int
                .map(fields -> {
                    try {
                        String name = fields[0].trim();
                        int age = Integer.parseInt(fields[1].trim());
                        String email = fields[2].trim();
                        return new Person(name, age, email);
                    } catch (NumberFormatException e) {
                        // Skip lines with invalid age
                        return null;
                    }
                })
                // Filter out nulls from failed parses
                .filter(Objects::nonNull)
                // Collect into a list
                .collect(Collectors.toList());

            people.forEach(System.out::println);
        } catch (IOException e) {
            System.err.println("Failed to read CSV file: " + e.getMessage());
        }
    }
}
people.csv
name,age,email
Alice,30,alice@example.com
Bob,notanumber,bob@example.com
Charlie,25,charlie@example.com
Dana,40,dana@example.com
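Running the parser on this file prints the three valid rows; Bob's row is dropped because his age cannot be parsed as an int:

Alice (30), alice@example.com
Charlie (25), charlie@example.com
Dana (40), dana@example.com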
Here is how the pipeline works:
- Files.lines(csvFile) lazily reads the file line by line.
- .skip(1) ignores the CSV header row.
- .map(line -> line.split(",")) transforms each line into a String array.
- .filter(fields -> fields.length == 3) drops rows with the wrong number of fields.
- The second .map parses the fields into Person objects and returns null on parse failure.
- .filter(Objects::nonNull) removes the failed parses before the Person objects are collected into a list.

This simple CSV parser illustrates how Java Streams and lambdas can be combined for clean, functional data processing pipelines.
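The parsed Person objects also drop straight into further stream operations. For example (a sketch assuming the people list from above and an import for java.util.Map):

// Average age of the successfully parsed people
double averageAge = people.stream()
    .mapToInt(p -> p.age)
    .average()
    .orElse(0);
System.out.println("Average age: " + averageAge);

// Group people by the domain part of their email address
Map<String, List<Person>> byDomain = people.stream()
    .collect(Collectors.groupingBy(p -> p.email.substring(p.email.indexOf('@') + 1)));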