I need to catch errors that occur during XML serialization (mostly caused by invalid ASCII control codes in the data), so I replaced the controller endpoint as shown below.
I found the approach for doing that here - http://stackoverflow.com/questions/16552715/how-do-i-trap-a-serializationexception-in-web-api/16554959#16554959
```csharp
[DataContract(Name = "ActivityData")]
[XmlRoot("Activity")]
public class ActivityData
{
    [DataMember]
    public string Id { get; set; }

    [DataMember]
    public ActivityType type { get; set; }

    // ...other data members
}
```
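For context, the serialization failures come from characters that XML 1.0 simply forbids (most code points below U+0020). A minimal standalone sketch (not the actual endpoint code) that reproduces the kind of error I'm seeing:

```csharp
using System;
using System.IO;
using System.Xml;

class Program
{
    static void Main()
    {
        // XmlWriterSettings.CheckCharacters defaults to true, so writing a
        // character that is illegal in XML 1.0 throws an ArgumentException.
        var writer = XmlWriter.Create(new StringWriter());
        writer.WriteStartElement("Activity");
        try
        {
            writer.WriteString("abc\u0001def"); // U+0001 is not a valid XML 1.0 character
        }
        catch (ArgumentException ex)
        {
            Console.WriteLine("Serialization failed: " + ex.Message);
        }
    }
}
```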
**Before:**
```csharp
[HttpGet]
public ActivityData GetActivityData(string startYear, string endYear)
{
    ActivityData activityData = Repository.GetActivityData(startYear, endYear);
    return activityData;
}
```
**After:**
```csharp
[HttpGet]
public HttpResponseMessage GetActivityData(string startYear, string endYear)
{
    try
    {
        var responseContent = Repository.GetActivityData(startYear, endYear);
        var response = Request.CreateResponse(HttpStatusCode.OK, responseContent);

        // The line below ensures that any errors during serialization are caught.
        // But does it affect performance?
        response.Content.LoadIntoBufferAsync().Wait();
        return response;
    }
    catch (Exception ex)
    {
        logger.Error(ex);
        return Request.CreateResponse(HttpStatusCode.InternalServerError, ex);
    }
}
```
My question is: will this change have any impact on the API's performance?
Does returning the object directly 'stream' the data continuously as it gets serialized, whereas loading it into a buffer makes the API serialize the entire object first and return it all at once? Or is the behaviour the same in both cases?
I noticed memory consumption go up when the code hit the return statement in both versions. I would just like to know for sure that there won't be any performance issues because of this change.
The returned XML document can be anywhere from a few MB to a few hundred MB of payload, with row counts ranging from hundreds up to a million (depending on the start and end dates), so is there a limit to the buffer size?
Thanks in advance.