Huge WFS inserts fail on 3+ MB POSTs

Description

I have been evaluating WFS-T for possible inclusion in our project, and have been working with GeoServer's WFS-T to produce a demo showing what a web feature server can do and how it can help us move features around. I've run into a problem POSTing large amounts of data. When I try to insert an 11+ MB feature, I get an unhandled exception, something on the order of "Java POST exceeds limit." I was wondering if anyone besides me has come across this error, and whether it is the result of my naiveté or an actual problem? How are other people handling this? My only solution so far has been to use an XSLT to transform the XML into plain Postgres INSERT INTO statements, but that will not work long term for us.

Basically I am trying to insert a transaction that looks like:

<wfs:Transaction version="1.0.0" service="WFS">
<wfs:Insert>
<DC:Location></DC:Location>
</wfs:Insert>
<wfs:Insert>
<DC:Location></DC:Location>
</wfs:Insert>
<wfs:Insert>
<DC:Location></DC:Location>
</wfs:Insert>
...
<wfs:Insert> (this is the nth insert; some of my pseudo-complex features have 10k inserts)
<DC:Location></DC:Location>
</wfs:Insert>

</wfs:Transaction>

I tried to insert 5522 features via one POST, and got 8692 entries for those features in catalina.out. Somewhere during the insert it fails.

The Setup:

SERVER: Powerbook 667

GeoServer

PostGIS, simple featuretype (e.g. one table)

CLIENT: PC (though I also put GeoServer on the PC to see if that would make a difference)

IE on the PC for inserting small features

For large POSTs I wrote a little client app that loads the mega features from text and POSTs them.
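The original client app is not shown in the report; a minimal sketch of such a client in Java might look like the following (the class name, method name, and endpoint are hypothetical). Chunked streaming mode keeps the client from buffering the whole multi-megabyte body before sending:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;

public class WfsPostClient {
    // Streams a large transaction file to a WFS endpoint without
    // buffering the whole body in client memory (chunked transfer encoding).
    static int postTransaction(String endpoint, Path xmlFile) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setChunkedStreamingMode(8192); // send in 8 KB chunks
        try (OutputStream out = conn.getOutputStream()) {
            Files.copy(xmlFile, out); // stream the file straight to the socket
        }
        return conn.getResponseCode();
    }
}
```

Note this only avoids buffering on the client side; whatever limit the server container enforces on the request body still applies.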

Thanks,

John Stiening

Environment

None

Activity

codehaus
April 10, 2015, 4:48 PM

CodeHaus Comment From: aaime - Time: Fri, 4 Dec 2009 04:06:15 -0600
---------------------
<p>It is my impression that the current code will have trouble parsing a huge insert as well.

Justin? Is it working fully in memory or streaming?</p>


CodeHaus Comment From: jdeolive - Time: Mon, 14 Dec 2009 12:53:46 -0600
---------------------
<p>Yup, all the features get parsed into memory. Doing streaming on inserts would be hard. For one, it would mean we would have to temporarily save the request out to some intermediate location, since an entire request must be parsed before it is dispatched. Then we would have to come up with a feature collection that stream-parses the saved-out features. All in all, probably something that won't be fixed without some funding.</p>


CodeHaus Comment From: aaime - Time: Sat, 28 Jan 2012 14:12:40 -0600
---------------------
<p>I see. However, for the sake of this bug report, I guess we can accept a transaction that's 3MB, provided the container allows such a big POST (which it won't by default). Something we have to try; if it works I'd say go ahead and close it.</p>
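For Tomcat (the usual GeoServer container), the commonly adjusted knob is the `maxPostSize` attribute on the HTTP Connector in `server.xml`, which defaults to 2 MB. The port/protocol values below are illustrative; note also that `maxPostSize` strictly caps container-parsed form parameters, so whether it bites depends on how the request body is read:

```
<!-- server.xml: raise the POST size cap (Tomcat default is 2097152 bytes) -->
<Connector port="8080" protocol="HTTP/1.1"
           maxPostSize="20971520" />
```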


CodeHaus Comment From: aaime - Time: Sat, 17 Aug 2013 09:04:15 -0500
---------------------
<p>Just tried 2.4.x with a 4MB insert request and could not replicate, so closing. Sample request attached (compressed; it's around 8000 copies of the same feature, so it compresses really well)</p>

Andrea Aime
February 15, 2017, 11:52 AM

Mass closing all resolved issues not modified in the last 4 weeks

Assignee

Unassigned

Reporter

codehaus

Triage

None

Fix versions

None

Affects versions

Components

Priority

High