I'm sending files containing binary data from service A to service B. When the number of files is relatively small (let's say 5) everything works well. However, when I try to send more files (let's say several hundred) it sometimes fails. I tried to check what is happening with this binary data, and it looks like WebClient corrupts it in some way (weird padding appears at the end).
I created a minimal reproducible example to illustrate this issue.
Endpoint in service B (consuming binary files):
@RestController
class FilesController {

    @PostMapping(value = "/files")
    Mono<List<String>> uploadFiles(@RequestBody Flux<Part> parts) {
        return parts
                .filter(FilePart.class::isInstance)
                .map(FilePart.class::cast)
                .flatMap(part -> DataBufferUtils.join(part.content())
                        .map(buffer -> {
                            byte[] data = new byte[buffer.readableByteCount()];
                            buffer.read(data);
                            DataBufferUtils.release(buffer);
                            return Base64.getEncoder().encodeToString(data);
                        })
                )
                .collectList();
    }
}
Tests illustrating how service A sends the data:
public class BinaryUploadTest {

    private final CopyOnWriteArrayList<String> sentBytes = new CopyOnWriteArrayList<>();

    @BeforeEach
    void before() {
        sentBytes.clear();
    }

    /**
     * this test passes all the time
     */
    @Test
    void shouldUpload5Files() {
        // given
        MultiValueMap<String, HttpEntity<?>> body = buildResources(5);

        // when
        List<String> receivedBytes = sendPostRequest(body);

        // then
        assertEquals(sentBytes, receivedBytes);
    }

    /**
     * this test fails most of the time
     */
    @Test
    void shouldUpload1000Files() {
        // given
        MultiValueMap<String, HttpEntity<?>> body = buildResources(1000);

        // when
        List<String> receivedBytes = sendPostRequest(body);

        // then
        assertEquals(sentBytes, receivedBytes);
    }

    private List<String> sendPostRequest(MultiValueMap<String, HttpEntity<?>> body) {
        return WebClient.builder().build().post()
                .uri("http://localhost:8080/files")
                .contentType(MediaType.MULTIPART_FORM_DATA)
                .body(BodyInserters.fromMultipartData(body))
                .retrieve()
                .bodyToMono(new ParameterizedTypeReference<List<String>>() {
                })
                .block();
    }

    private MultiValueMap<String, HttpEntity<?>> buildResources(int numberOfResources) {
        MultipartBodyBuilder builder = new MultipartBodyBuilder();
        for (int i = 0; i < numberOfResources; i++) {
            builder.part("item-" + i, buildResource(i));
        }
        return builder.build();
    }

    private ByteArrayResource buildResource(int index) {
        byte[] bytes = randomBytes();
        sentBytes.add(Base64.getEncoder().encodeToString(bytes)); // keeps track of what has been sent
        return new ByteArrayResource(bytes) {
            @Override
            public String getFilename() {
                return "filename-" + index;
            }
        };
    }

    private byte[] randomBytes() {
        byte[] bytes = new byte[ThreadLocalRandom.current().nextInt(16, 32)];
        ThreadLocalRandom.current().nextBytes(bytes);
        return bytes;
    }
}
What could be the reason for this data corruption?
It turned out to be a bug in the Spring Framework (in the MultipartParser class, to be more precise). I created a GitHub issue, and the bug will be fixed in the next version (5.3.16); the fix is in this commit.
I'm trying to add some enhancements to this app:
private void parseCsv(CsvMapReader csvMapReader) throws IOException {
    String[] header = csvMapReader.getHeader(true);
    List<String> headers = Arrays.asList(header);
    verifySourceColumn(headers);
    verifyPovColumn(headers);

    final CellProcessor[] processors = getProcessors(headers);
    Map<String, Object> csvImportMap = null;
    while ((csvImportMap = csvMapReader.read(header, processors)) != null) {
        CsvImportDTO csvImportDto = new CsvImportDTO(csvImportMap);
        if (activationTypeP(csvImportDto)) {
            AipRolloutVO aipRolloutVO = new AipRolloutVO(csvImportDto.getSource(),
                    csvImportDto.getPov(),
                    csvImportDto.getActivationType(),
                    csvImportDto.getActivationDate(),
                    csvImportDto.getDeactivationDate(),
                    csvImportDto.getMssValue());
            aipRolloutRepository.updateAipRollout(aipRolloutVO.getDc(),
                    aipRolloutVO.getPov(),
                    aipRolloutVO.getActivationType(),
                    aipRolloutVO.getMssValue());
        }
    }
}
When it gets to the repository call, I get:
cannot find local variable 'csvImportMap'
five times, and then:
((CsvParserService)this).aipRolloutService = inconvertible types; cannot cast 'org.spring.....
My controller method:
#PostMapping(value = "/updatecsv", produces = MediaType.APPLICATION_JSON_VALUE)
#ResponseBody
public ResponseEntity<?> processCsv( #RequestParam("csvFile") MultipartFile csvFile) throws IOException {
if (csvFile.isEmpty()) return new ResponseEntity(
responceJson("please select a file!"),
HttpStatus.NO_CONTENT
);
csvParserService.parseCsvFile(csvFile);
return new ResponseEntity(
responceJson("Successfully uploaded - " + csvFile.getOriginalFilename()),
new HttpHeaders(),
HttpStatus.CREATED
);
}
The repository method I'm trying to reuse to update these values:
public int updateAipRollout(String dc, String pov, String activationType, int mssValue) {
    String query = someQuery; // the actual update SQL is elided here
    logger.debug("Rollout update query: " + query);
    int num = jdbcTemplate.update(query);
    return num;
}
Do I need to autowire the other service class that this repository is used in and then call that service? I tried that, but it didn't fix the error either.
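For reference, the usual wiring for this is plain constructor injection of the repository into the parsing service. Below is a minimal sketch only; the class names are inferred from the snippets above, and AipRolloutRepository is assumed to be the Spring bean that holds updateAipRollout.
// Sketch only: constructor injection, assuming CsvParserService and AipRolloutRepository
// are both Spring beans (@Service / @Repository); names are inferred from the snippets above.
@Service
public class CsvParserService {

    private final AipRolloutRepository aipRolloutRepository;

    public CsvParserService(AipRolloutRepository aipRolloutRepository) {
        this.aipRolloutRepository = aipRolloutRepository;
    }

    // parseCsvFile(...) and parseCsv(...) as shown above
}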
I am developing a prototype for a new project. The idea is to provide a reactive Spring Boot microservice to bulk index documents in Elasticsearch. Elasticsearch provides a High Level REST Client, which offers an async method to bulk process indexing requests. The async variant delivers callbacks using listeners, as mentioned here. The callbacks receive index responses (per request) in batches. I am trying to send these responses back to the client as a Flux. I have come up with something based on this blog post.
Controller
@RestController
public class AppController {

    @SuppressWarnings("unchecked")
    @RequestMapping(value = "/test3", method = RequestMethod.GET)
    public Flux<String> index3() {
        ElasticAdapter es = new ElasticAdapter();
        JSONObject json = new JSONObject();
        json.put("TestDoc", "Stack123");
        Flux<String> fluxResponse = es.bulkIndex(json);
        return fluxResponse;
    }
}
ElasticAdapter
@Component
class ElasticAdapter {

    String indexName = "test2";
    private final RestHighLevelClient client;
    private final ObjectMapper mapper;
    private int processed = 1;

    Flux<String> bulkIndex(JSONObject doc) {
        return bulkIndexDoc(doc)
                .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
    }

    private Flux<String> bulkIndexDoc(JSONObject doc) {
        return Flux.create(sink -> {
            try {
                doBulkIndex(doc, bulkListenerToSink(sink));
            } catch (JsonProcessingException e) {
                sink.error(e);
            }
        });
    }

    private void doBulkIndex(JSONObject doc, BulkProcessor.Listener listener) throws JsonProcessingException {
        System.out.println("Going to submit index request");
        BiConsumer<BulkRequest, ActionListener<BulkResponse>> bulkConsumer =
                (request, bulkListener) ->
                        client.bulkAsync(request, RequestOptions.DEFAULT, bulkListener);
        BulkProcessor.Builder builder =
                BulkProcessor.builder(bulkConsumer, listener);
        builder.setBulkActions(10);
        BulkProcessor bulkProcessor = builder.build();
        // Submitting 5,000 index requests (repeating the same JSON)
        for (int i = 0; i < 5000; i++) {
            IndexRequest indexRequest = new IndexRequest(indexName, "person", i + 1 + "");
            String json = doc.toJSONString();
            indexRequest.source(json, XContentType.JSON);
            bulkProcessor.add(indexRequest);
        }
        System.out.println("Submitted all docs");
    }

    private BulkProcessor.Listener bulkListenerToSink(FluxSink<String> sink) {
        return new BulkProcessor.Listener() {

            @Override
            public void beforeBulk(long executionId, BulkRequest request) {
            }

            @SuppressWarnings("unchecked")
            @Override
            public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
                for (BulkItemResponse bulkItemResponse : response) {
                    JSONObject json = new JSONObject();
                    json.put("id", bulkItemResponse.getResponse().getId());
                    json.put("status", bulkItemResponse.getResponse().getResult());
                    sink.next(json.toJSONString());
                    processed++;
                }
                if (processed >= 5000) {
                    sink.complete();
                }
            }

            @Override
            public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
                failure.printStackTrace();
                sink.error(failure);
            }
        };
    }

    public ElasticAdapter() {
        // Logic to initialize the Elasticsearch REST client
    }
}
I used FluxSink to create the Flux of responses to send back to the client. At this point, I have no idea whether this is correct or not.
My expectation is that the calling client should receive the responses in batches of 10 (because the bulk processor processes them in batches of 10: builder.setBulkActions(10)). I tried to consume the endpoint using the Spring WebFlux client, but I was unable to get it working. This is what I tried:
WebClient
public class FluxClient {
public static void main(String[] args) {
WebClient client = WebClient.create("http://localhost:8080");
Flux<String> responseFlux = client.get()
.uri("/test3")
.retrieve()
.bodyToFlux(String.class);
responseFlux.subscribe(System.out::println);
}
}
Nothing is printed to the console as I expected. I tried using System.out.println(responseFlux.blockFirst()); instead. It prints all the responses as a single batch at the end, and not in batches.
If my approach is correct, what is the right way to consume it? In the solution I have in mind, this client will reside in another web app.
Notes: my understanding of the Reactor API is limited. The version of Elasticsearch used is 6.8.
So I made the following changes to your code.
In ElasticAdapter,
public Flux<String> bulkIndex(JSONObject doc) {
    return bulkIndexDoc(doc)
            .subscribeOn(Schedulers.elastic(), true)
            .doOnError(e -> System.out.print("Unable to index {}" + doc + e));
}
I invoked subscribeOn(Scheduler, requestOnSeparateThread) on the Flux. I got to know about it from https://github.com/spring-projects/spring-framework/issues/21507
In FluxClient,
Flux<String> responseFlux = client.get()
.uri("/test3")
.headers(httpHeaders -> {
httpHeaders.set("Accept", "text/event-stream");
})
.retrieve()
.bodyToFlux(String.class);
responseFlux.delayElements(Duration.ofSeconds(1)).subscribe(System.out::println);
Added "Accept" header as "text/event-stream" and delayed Flux elements.
With the above changes, was able to get the response in real time from the server.
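As a side note (an assumption on my part, not something verified in the original setup): declaring the streaming media type on the server side usually has the same effect as sending the Accept header from the client. A minimal sketch, reusing the controller from the question:
// Sketch only: declaring text/event-stream on the endpoint so the Flux is streamed
// as Server-Sent Events without the client having to set the Accept header.
@GetMapping(value = "/test3", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> index3() {
    ElasticAdapter es = new ElasticAdapter();
    JSONObject json = new JSONObject();
    json.put("TestDoc", "Stack123");
    return es.bulkIndex(json);
}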
How to upload multiple files using WebFlux?
I send a request with content type multipart/form-data, and the body contains one part whose value is a set of files.
To process a single file I do it as follows:
Mono<MultiValueMap<String, Part>> body = request.body(BodyExtractors.toMultipartData());
body.flatMap(map -> {
    FilePart part = (FilePart) map.toSingleValueMap().get("file");
    return part.transferTo(Paths.get("/tmp/" + part.filename())); // example processing
});
But how to do it for multiple files?
P.S. Is there another way to upload a set of files in WebFlux?
I already found some solutions.
Let's suppose that we send an HTTP POST request with a parameter files which contains our files.
Note: the responses are arbitrary.
RestController with RequestPart
#PostMapping("/upload")
public Mono<String> process(#RequestPart("files") Flux<FilePart> filePartFlux) {
return filePartFlux.flatMap(it -> it.transferTo(Paths.get("/tmp/" + it.filename())))
.then(Mono.just("OK"));
}
RestController with ModelAttribute
#PostMapping("/upload-model")
public Mono<String> processModel(#ModelAttribute Model model) {
model.getFiles().forEach(it -> it.transferTo(Paths.get("/tmp/" + it.filename())));
return Mono.just("OK");
}
class Model {
private List<FilePart> files;
//getters and setters
}
Functional way with HandlerFunction
public Mono<ServerResponse> upload(ServerRequest request) {
Mono<String> then = request.multipartData().map(it -> it.get("files"))
.flatMapMany(Flux::fromIterable)
.cast(FilePart.class)
.flatMap(it -> it.transferTo(Paths.get("/tmp/" + it.filename())))
.then(Mono.just("OK"));
return ServerResponse.ok().body(then, String.class);
}
You can also iterate over a map with Flux and return a Flux:
Flux.fromIterable(hashMap.entrySet())
        .map(Map.Entry::getValue);
and it will be sent as an array of file parts.
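To make that concrete, here is a minimal sketch (endpoint path assumed, not from the original answer) of a handler that builds such a Flux from the multipart map and returns the uploaded filenames:
// Sketch only: iterate the multipart map reactively and return the filenames.
@PostMapping("/upload-map")
public Flux<String> processMap(ServerWebExchange exchange) {
    return exchange.getMultipartData()
            .flatMapIterable(MultiValueMap::entrySet)   // one entry per part name
            .flatMapIterable(Map.Entry::getValue)       // each entry may hold several Parts
            .filter(FilePart.class::isInstance)
            .cast(FilePart.class)
            .map(FilePart::filename);
}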
The key is to use toParts instead of toMultipartData, which is simpler. Here is an example that works with RouterFunctions.
private Mono<ServerResponse> working2(final ServerRequest request) {
final Flux<Void> voidFlux = request.body(BodyExtractors.toParts())
.cast(FilePart.class)
.flatMap(filePart -> {
final String extension = FilenameUtils.getExtension(filePart.filename());
final String baseName = FilenameUtils.getBaseName(filePart.filename());
final String format = LocalDateTime.now().format(DateTimeFormatter.BASIC_ISO_DATE);
final Path path = Path.of("/tmp", String.format("%s-%s.%s", baseName, format, extension));
return filePart.transferTo(path);
});
return ServerResponse
.ok()
.contentType(APPLICATION_JSON_UTF8)
.body(voidFlux, Void.class);
}
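For completeness, a handler function like this is typically registered through a RouterFunction bean; a minimal sketch (the configuration class, bean and path names are assumed, and it presumes working2 is visible to the configuration class):
// Sketch only: registering the handler above with a router function.
@Configuration
public class UploadRouterConfig {

    @Bean
    public RouterFunction<ServerResponse> uploadRoute(UploadHandler handler) {
        return RouterFunctions.route(
                RequestPredicates.POST("/upload"), handler::working2);
    }
}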
Hope this helps.
#PostMapping(value = "/upload", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
public JSON fileUpload(#RequestPart FilePart file)throws Exception{
OSS ossClient = new OSSClientBuilder().build(APPConfig.ENDPOINT, APPConfig.ALI_ACCESSKEYID, APPConfig.ALI_ACCESSSECRET);
File f = null;
String url;
try {
String suffix = file.filename();
String fileName = "images/" + file.filename();
Path path = Files.createTempFile("tempimg", suffix.substring(1, suffix.length()));
file.transferTo(path);
f = path.toFile();
ossClient.putObject(APPConfig.BUCKETNAME, fileName, new FileInputStream(f));
Date expiration = new Date(System.currentTimeMillis() + 3600L * 1000 * 24 * 365 * 10);
url = ossClient.generatePresignedUrl(APPConfig.BUCKETNAME, fileName, expiration).toString();
}finally {
f.delete();
ossClient.shutdown();
}
return JSONUtils.successResposeData(url);
}
Following is the working code for uploading multiple files using WebFlux.
#RequestMapping(value = "upload", method = RequestMethod.POST)
Mono<Object> upload(#RequestBody Flux<Part> parts) {
return parts.log().collectList().map(mparts -> {
return mparts.stream().map(mmp -> {
if (mmp instanceof FilePart) {
FilePart fp = (FilePart) mmp;
fp.transferTo(new File("c:/hello/"+fp.filename()));
} else {
// process the other non file parts
}
return mmp instanceof FilePart ? mmp.name() + ":" + ((FilePart) mmp).filename() : mmp.name();
}).collect(Collectors.joining(",", "[", "]"));
});
};
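On the sending side (not asked directly here, but often needed alongside these handlers), a multipart request carrying several files can be built with MultipartBodyBuilder, just as in the first question above; a minimal sketch (URL, part name and file paths assumed):
// Sketch only: posting two files under the same part name "files" with WebClient.
MultipartBodyBuilder builder = new MultipartBodyBuilder();
builder.part("files", new FileSystemResource("/tmp/a.txt"));
builder.part("files", new FileSystemResource("/tmp/b.txt"));

String result = WebClient.create("http://localhost:8080").post()
        .uri("/upload")
        .contentType(MediaType.MULTIPART_FORM_DATA)
        .body(BodyInserters.fromMultipartData(builder.build()))
        .retrieve()
        .bodyToMono(String.class)
        .block();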
I've been trying for more than an hour to test this class. It got ugly, stubbing all the components of the method and so on. I'd love some advice on how to write a better test, or how to refactor the class to make it much easier to test. I could not figure out a way yet.
Class to Test
@Slf4j
public final class HistoryRestService {

    static RestTemplate restTemplate = new RestTemplate();

    public static Optional<List<History>> findLatestHistories() {
        String url = buildUrl();
        ResponseEntity<History[]> responseEntity = null;
        try {
            responseEntity = restTemplate.getForEntity(url, History[].class);
        } catch (ResourceAccessException e) {
            log.warn("No connection to History persistence. Please check if the history persistence started up properly");
            return Optional.empty();
        }
        History[] histories = responseEntity.getBody();
        return Optional.of(Arrays.asList(histories));
    }

    private static String buildUrl() {
        StringBuilder stringBuilder = new StringBuilder();
        stringBuilder.append("http://");
        stringBuilder.append("localhost");
        stringBuilder.append(":8081");
        stringBuilder.append("/history/get");
        return stringBuilder.toString();
    }

    // For Testing
    static void setRestTemplate(RestTemplate restTemplate) {
        HistoryRestService.restTemplate = restTemplate;
    }
}
Spock Test which fails
class HistoryRestServiceTest extends Specification {
def "test findLatestHistories"() {
given:
History mockedHistory = Mock()
HistoryRestService uut = new HistoryRestService()
History[] expected = [mockedHistory]
RestTemplate mockedRestTemplate = Stub()
ResponseEntity<History> mockedResponseEntity = Stub()
mockedResponseEntity.getBody() >> expected
mockedRestTemplate.getForEntity(_) >> mockedResponseEntity
uut.setRestTemplate(mockedRestTemplate)
when:
def actual = uut.findLatestHistories()
then:
actual.get() == expected
}
}
I'd suggest using real dependency injection (Spring/Guice/CDI) instead of static variables.
Furthermore, you should think about what you want to test. If it is the correct request and the parsing of the network call, then write an integration test using something like MockServer or WireMock so that the whole stack is exercised. If you are just concerned with the result handling, you could move the code that interacts with RestTemplate into a separate method and use partial mocking to mock that method. I'd suggest the real integration test, but for the sake of an example the following should work; note that I didn't verify the code.
@Slf4j
public class HistoryRestService {

    private final RestTemplate restTemplate;

    public HistoryRestService() {
        restTemplate = new RestTemplate();
    }

    public HistoryRestService(RestTemplate restTemplate) {
        this.restTemplate = restTemplate;
    }

    public Optional<List<History>> findLatestHistories() {
        try {
            return Optional.of(Arrays.asList(getLatestHistories(buildUrl())));
        } catch (ResourceAccessException e) {
            log.warn("No connection to History persistence. Please check if the history persistence started up properly");
            return Optional.empty();
        }
    }

    History[] getLatestHistories(String url) {
        ResponseEntity<History[]> responseEntity = restTemplate.getForEntity(url, History[].class);
        return responseEntity.getBody();
    }

    private String buildUrl() {
        StringBuilder stringBuilder = new StringBuilder();
        stringBuilder.append("http://");
        stringBuilder.append("localhost");
        stringBuilder.append(":8081");
        stringBuilder.append("/history/get");
        return stringBuilder.toString();
    }
}
class HistoryRestServiceTest extends Specification {

    @Subject
    HistoryRestService uut = Spy()

    def "test findLatestHistories"() {
        given:
        History mockedHistory = Mock()
        History[] expected = [mockedHistory]

        when:
        def actual = uut.findLatestHistories()

        then:
        actual.get() == expected
        1 * uut.getLatestHistories(_ as String) >> expected
    }

    def "test findLatestHistories returns empty on exceptions"() {
        when:
        def actual = uut.findLatestHistories()

        then:
        !actual.present
        1 * uut.getLatestHistories(_ as String) >> { throw new ResourceAccessException("connection refused") }
    }
}
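If you go for the integration-test route instead, Spring's own MockRestServiceServer is one option besides MockServer/WireMock: it exercises the real RestTemplate request building and response parsing without a live server. A minimal, unverified sketch against the refactored class (imports omitted, as elsewhere in this post):
// Sketch only: integration-style test of the refactored class using MockRestServiceServer.
class HistoryRestServiceIntegrationTest {

    @Test
    void findLatestHistoriesParsesResponse() {
        RestTemplate restTemplate = new RestTemplate();
        MockRestServiceServer server = MockRestServiceServer.createServer(restTemplate);
        server.expect(MockRestRequestMatchers.requestTo("http://localhost:8081/history/get"))
                .andRespond(MockRestResponseCreators.withSuccess("[]", MediaType.APPLICATION_JSON));

        HistoryRestService service = new HistoryRestService(restTemplate);

        Optional<List<History>> result = service.findLatestHistories();

        assertTrue(result.isPresent());
        server.verify();
    }
}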
I am following this tutorial on uploading files to a server from Android, but I cannot seem to get the code right on the server side. Can somebody please help me write the Web API POST method that would work with that Android Java uploader? My current Web API controller class looks like this:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web;
using System.Web.Http;
namespace WSISWebService.Controllers
{
public class FilesController : ApiController
{
// GET api/files
public IEnumerable<string> Get()
{
return new string[] { "value1", "value2" };
}
// GET api/files/5
public string Get(int id)
{
return "value";
}
// POST api/files
public string Post([FromBody]string value)
{
var task = this.Request.Content.ReadAsStreamAsync();
task.Wait();
Stream requestStream = task.Result;
try
{
Stream fileStream = File.Create(HttpContext.Current.Server.MapPath("~/" + value));
requestStream.CopyTo(fileStream);
fileStream.Close();
requestStream.Close();
}
catch (IOException)
{
// throw new HttpResponseException("A generic error occured. Please try again later.", HttpStatusCode.InternalServerError);
}
HttpResponseMessage response = new HttpResponseMessage();
response.StatusCode = HttpStatusCode.Created;
return response.ToString();
}
// PUT api/files/5
public void Put(int id, [FromBody]string value)
{
}
// DELETE api/files/5
public void Delete(int id)
{
}
}
}
I am pretty desperate to get this working, as the deadline is Tuesday. If anybody could help, that would be much appreciated.
You can post files as multipart/form-data:
// POST api/files
public async Task<HttpResponseMessage> Post()
{
// Check if the request contains multipart/form-data.
if (!Request.Content.IsMimeMultipartContent())
{
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}
string root = HttpContext.Current.Server.MapPath("~/App_Data");
var provider = new MultipartFormDataStreamProvider(root);
string value = null;
try
{
// Read the form data and return an async data.
var result = await Request.Content.ReadAsMultipartAsync(provider);
// This illustrates how to get the form data.
foreach (var key in provider.FormData.AllKeys)
{
foreach (var val in provider.FormData.GetValues(key))
{
// return multiple value from FormData
if (key == "value")
value = val;
}
}
if (result.FileData.Any())
{
// This illustrates how to get the file names for uploaded files.
foreach (var file in result.FileData)
{
FileInfo fileInfo = new FileInfo(file.LocalFileName);
if (fileInfo.Exists)
{
//do something with the file
}
}
}
HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.Created, value);
// NOTE: "files" is not defined in this snippet; replace files.Id with the id of the resource you created
response.Headers.Location = new Uri(Url.Link("DefaultApi", new { id = files.Id }));
return response;
}
catch (System.Exception e)
{
return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, e);
}
}