Group: ipernity API development
doc.search limit
I don't understand this 3000 items limit.
If what you want is to limit the number of calls, a 'per hour' call limit as on Flickr seems more effective to me.
As it is implemented now, I can always play with the parameters and circumvent it: the result is that the code is more difficult to write, but the server load only slightly decreases...
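For illustration, here is a minimal sketch of the kind of workaround described above: slicing the query into date windows so that each slice stays under the cap. The endpoint shape, the response layout, and the posted_min/posted_max filter names are assumptions for the example, not confirmed doc.search parameters.

```python
import requests  # plain REST transport for the sketch; the thread itself uses XML-RPC

API_URL = "http://api.ipernity.com/api/doc.search/json"  # illustrative endpoint
API_KEY = "YOUR_API_KEY"
PER_PAGE = 100        # the per-page cap mentioned in the thread
ITEM_CAP = 3000       # the server-side cap under discussion
WINDOW = 24 * 3600    # slice the query into one-day windows (unix seconds)

def search_window(posted_min, posted_max):
    """Fetch every doc in one time window, paging up to the 3000-item cap.

    posted_min/posted_max are hypothetical date filters; substitute
    whatever parameters doc.search actually accepts.
    """
    page = 1
    while (page - 1) * PER_PAGE < ITEM_CAP:
        resp = requests.get(API_URL, params={
            "api_key": API_KEY,
            "posted_min": posted_min,
            "posted_max": posted_max,
            "per_page": PER_PAGE,
            "page": page,
        }).json()
        docs = resp.get("docs", {}).get("doc", [])  # assumed response shape
        if not docs:
            return
        yield from docs
        page += 1

def search_all(start, end):
    """Circumvent the cap by sliding a narrow window over the full range."""
    t = start
    while t < end:
        yield from search_window(t, min(t + WINDOW, end))
        t += WINDOW
```

This is exactly the extra complexity the limit forces on clients: one more outer loop and more roundtrips, with little saved on the server.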
Roberto Ballerini -… has replied to Dirk: But when you intrinsically have to deal with large quantities of data, caching doesn't help (combined with the 100-items-per-page limit, which forces you to make more roundtrips to obtain the same quantity of data). Analysing a single day's worth of shots, I have already seen the data change while I was looping... Perhaps the solution is caching/proxying on the server side of the XML-RPC interaction rather than on the client side.
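One client-side mitigation for the data-changing-mid-loop problem is to pin the query's upper bound before the loop starts, so uploads arriving during the iteration cannot shift items between pages. A minimal sketch, reusing the hypothetical date-filter assumption from the example above:

```python
import time

def stable_loop(fetch_page):
    """Iterate pages against a timestamp frozen before the loop starts.

    fetch_page(page, posted_max) is a hypothetical callable wrapping
    doc.search; posted_max is an assumed date filter, not a confirmed
    ipernity parameter.
    """
    frozen_max = int(time.time())  # snapshot taken once, up front
    page = 1
    while True:
        docs = fetch_page(page, frozen_max)
        if not docs:
            break
        yield from docs
        page += 1
```

This only stabilises one pass over the data; it does not remove the roundtrip cost, which is why server-side caching/proxying may still be the better answer.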
Christophe Ruelle: The API call limits are much higher (100K requests/day or so).
Roberto Ballerini -… has replied to Christophe Ruelle: Some examples:
- a month's worth of public uploads: 150,000 shots
- Ojisanjake's stream: 12,000+ shots
- bigger groups: 10,000+ shots
These are three situations where we'd have to add an extra outer loop, changing some parameters, to get through the whole set.
If you want to write a bot that helps admins manage group pools, if you want to sum the total visits on Jake's stream, if you want to find the most-viewed shots of September, ... these are all situations where this 3000-item limit adds complexity to the code.
So if the reason for the limit is to reduce server load, well, I can understand it, but I think it isn't effective: I can code around it. It would be better to introduce a per-hour call limit, to force us to streamline the code (though possibly this testing phase isn't the right moment for it...).
You have to find a balance between server costs and quality of service for developers: I think the quantity of Flickr mashups available is one of the greatest reasons for their success --> give developers the possibility to grow your success ;-)
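The per-hour throttle proposed here is also easy to express client-side while waiting for a server-side one. A minimal token-bucket sketch; the 5000-calls-per-hour figure is purely illustrative, since the thread does not say what quota ipernity would enforce:

```python
import threading
import time

class HourlyRateLimiter:
    """Token bucket capping calls per rolling hour (figure is illustrative)."""

    def __init__(self, calls_per_hour=5000):
        self.capacity = calls_per_hour
        self.tokens = float(calls_per_hour)
        self.rate = calls_per_hour / 3600.0   # tokens replenished per second
        self.last = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self):
        """Block until a call slot is available, then consume it."""
        while True:
            with self.lock:
                now = time.monotonic()
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
                wait = (1 - self.tokens) / self.rate
            time.sleep(wait)

limiter = HourlyRateLimiter()
# limiter.acquire()  # call before each doc.search request
```

A throttle like this bounds server load directly, whereas the 3000-item cap only bounds one query's result set and, as argued above, just pushes clients into making more queries.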