I have a task: read a large text file, filter out the data I need from it, do some manipulation with that data, and write the corrected data to a new file. What approach would you advise for reading large files? I'm using a BufferedReader, but so far only against a stub (a small file), and I don't know whether this approach will cope with a large file of about 10K lines. Can you suggest more efficient ways to read it?
Answer 1, Authority 100%
Try this:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class Test {

    public static void main(String[] args) throws IOException {
        String inputFileName = "";
        String outputFileName = "";
        // Lazily stream the lines of the input file, transform each one,
        // and write the joined result to the output file.
        try (Stream<String> stream = Files.lines(Paths.get(inputFileName))) {
            Files.write(Paths.get(outputFileName),
                    stream.map(line -> transformLine(line))
                          .collect(Collectors.joining("\n"))
                          .getBytes());
        }
    }

    public static String transformLine(String line) {
        // TODO: filter/transform the line here.
        return line;
    }
}
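One caveat: Collectors.joining("\n") assembles the whole output in memory before writing, so for really large files a fully streaming variant may be preferable. Below is a minimal sketch of that idea; the class name StreamingTest, the empty file name placeholders, and the transformLine method are assumptions mirroring the snippet above, not a fixed API.

import java.io.BufferedWriter;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class StreamingTest {

    public static void main(String[] args) throws IOException {
        String inputFileName = "";
        String outputFileName = "";
        // Read and write line by line so the whole file is never held in memory.
        try (Stream<String> lines = Files.lines(Paths.get(inputFileName));
             BufferedWriter writer = Files.newBufferedWriter(Paths.get(outputFileName))) {
            lines.map(StreamingTest::transformLine)
                 // A .filter(...) step could be inserted here to drop unwanted lines.
                 .forEach(line -> {
                     try {
                         writer.write(line);
                         writer.newLine();
                     } catch (IOException e) {
                         // forEach cannot throw checked exceptions, so wrap it.
                         throw new UncheckedIOException(e);
                     }
                 });
        }
    }

    static String transformLine(String line) {
        // TODO: filter/transform the line here.
        return line;
    }
}

Both versions read the file lazily via Files.lines, so 10K lines is not a problem either way; the streaming write just keeps memory usage flat if the file grows much larger.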