I am reading data from an XML file and loading it into a DataTable. I cannot load it into a List&lt;T&gt; because the DataTable has some dynamic columns whose names are not known at compile time.
I have one List&lt;T&gt; and one DataTable: _AllCommentsData is the List&lt;T&gt; and dtData is the DataTable.
_AllCommentsData holds roughly 10,000 to 20,000 items, while dtData holds a very high volume of data, approximately 7,000,000 (7 million) rows. Traversing that many rows and updating one column value takes a long time. Please guide me on how to refactor or restructure the code so that updating one column value over this high volume of data takes much less time. Does any workaround exist for this situation?
I could join my DataTable and List together and update the field value, but I cannot do it because a few column names in the DataTable are dynamic, so I cannot mention them in a LINQ select. That is why I query the DataTable inside a loop and update the data there.
Can I use Parallel.For() or AsParallel()? Please guide me with some sample code that I can use to speed up the data update when there is a high volume of data in the DataTable. Thanks.
My sample code:
foreach (var item in _AllCommentsData.Where(a => a.CommentType == "Only LI"))
{
    if (item.Comments != "")
    {
        // Scans the full 7M-row table once per list item: O(n * m) overall.
        tmpData = dtData.AsEnumerable().Where(a =>
            a.Field<string>("Section") == item.Section
            && a.Field<string>("LineItem") == item.LineItem
            && (a.Field<int?>("EarningID") ?? 0) == item.EarningID);

        // IEnumerable<DataRow> has no ForEach(); a plain foreach also avoids
        // the extra full enumeration that the Count() > 0 guard was doing.
        foreach (var row in tmpData)
        {
            row["LI_Comment"] = 1;
        }
    }
}
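One possible restructuring: since the loop above rescans all 7 million rows once per list item, the work is O(n * m). Inverting it — index the small list into a HashSet of (Section, LineItem, EarningID) keys, then make a single pass over the big table — reduces this to O(n + m). The sketch below assumes a hypothetical CommentItem class standing in for the asker's List&lt;T&gt; element type; property and column names are taken from the question.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;

// Hypothetical element type of _AllCommentsData, inferred from the question.
class CommentItem
{
    public string CommentType { get; set; }
    public string Comments { get; set; }
    public string Section { get; set; }
    public string LineItem { get; set; }
    public int EarningID { get; set; }
}

static class CommentUpdater
{
    public static void MarkComments(DataTable dtData, List<CommentItem> allCommentsData)
    {
        // 1) Index the small list once (10-20k items) into a hash set of keys.
        var keys = new HashSet<(string, string, int)>(
            allCommentsData
                .Where(a => a.CommentType == "Only LI" && a.Comments != "")
                .Select(a => (a.Section, a.LineItem, a.EarningID)));

        // 2) Single pass over the big table; each lookup is O(1) on average.
        foreach (DataRow row in dtData.Rows)
        {
            var key = (row.Field<string>("Section"),
                       row.Field<string>("LineItem"),
                       row.Field<int?>("EarningID") ?? 0);  // treat DBNull as 0
            if (keys.Contains(key))
                row["LI_Comment"] = 1;
        }
    }
}

static class Program
{
    static void Main()
    {
        // Small sample table with the columns the question uses.
        var dtData = new DataTable();
        dtData.Columns.Add("Section", typeof(string));
        dtData.Columns.Add("LineItem", typeof(string));
        dtData.Columns.Add("EarningID", typeof(int));
        dtData.Columns.Add("LI_Comment", typeof(int));
        dtData.Rows.Add("S1", "L1", 5, 0);
        dtData.Rows.Add("S1", "L2", DBNull.Value, 0);
        dtData.Rows.Add("S2", "L1", 7, 0);

        var allCommentsData = new List<CommentItem>
        {
            new CommentItem { CommentType = "Only LI", Comments = "x",
                              Section = "S1", LineItem = "L1", EarningID = 5 },
            new CommentItem { CommentType = "Only LI", Comments = "x",
                              Section = "S1", LineItem = "L2", EarningID = 0 },
        };

        CommentUpdater.MarkComments(dtData, allCommentsData);

        foreach (DataRow row in dtData.Rows)
            Console.WriteLine(row["LI_Comment"]); // matched rows print 1, others 0
    }
}
```

On the Parallel.For / AsParallel question: writing to DataRow objects is not thread-safe, so parallelizing a loop that mutates the table risks corruption without extra locking. In practice the single keyed pass above usually removes the bottleneck on its own, since the cost drops from roughly 7M x 20k comparisons to one scan of each collection.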