
I have a large amount of real-time data that needs to be processed as fast as possible.

The data comes from multiple network connection threads.

All network threads pass the data to a shared function that performs some translation and interpretation on the received data, then saves the result into a ConcurrentDictionary, object by object.

The problem is that the number of objects stored in this dictionary may reach 150K, and fetching an object to update it takes much longer than the acceptable time.

public class MyObject
{  
  System.Timers.Timer LostTimer = new System.Timers.Timer();
  public int ID;
  public DateTime UpdateTime;

  public  MyObject()
  {
    LostTimer.Interval = 20000;
    LostTimer.Elapsed += new ElapsedEventHandler(LostTimer_Elapsed);
    LostTimer.Enabled = true;
  }

  void LostTimer_Elapsed(object sender, ElapsedEventArgs e)
  {
    // The object is lost if its last update is older than the 20-second grace period.
    if (UpdateTime < DateTime.Now.AddSeconds(-20))
        Console.WriteLine(ID + " Lost...");
  }

}

public class MyClass
{
  public MyClass(){}

  private ConcurrentDictionary<int,MyObject> Objects = new ConcurrentDictionary<int,MyObject>();

  // Called from each network thread with the received strings.
  void NetworkThread1DataReceived(string[] data)
  {
    Translate(data);
  }
  void Translate(string[] data)
  {
    Task.Factory.StartNew(() =>
    {
      Parallel.ForEach(data, s =>
      {
        MyObject o;
        if (!Objects.TryGetValue(int.Parse(s), out o))
        {
          o = new MyObject();
          o.ID = int.Parse(s);
          o.UpdateTime = DateTime.Now;

          Objects.TryAdd(o.ID, o);
        }
        else
        {
          o.UpdateTime = DateTime.Now;
        }
      });
    });
  }
}
}

Now, when working with more than 30K objects, it starts reporting objects as lost.

The logic is that I subtract the object's grace period from the current system time and compare the result with the last update time for that object. Do you think this type of thread-safe collection (ConcurrentDictionary) cannot handle this amount of data, causing read/write access delays that make objects appear lost?

Before, I was using List&lt;MyObject&gt; with lock(obj){} to handle multi-threaded access to this shared memory, but it failed after 10K objects. After changing to ConcurrentDictionary (.NET's built-in thread-safe dictionary), it works well up to 30K.

My target is to reach 150K. Can I get there with this logic?
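(Editor's note: the lookup-then-insert in Translate above can also be collapsed into a single atomic call. A sketch, not part of the original post; the lambda form of GetOrAdd runs only when the key is absent:)

// Sketch: ConcurrentDictionary.GetOrAdd performs the lookup-or-insert atomically,
// avoiding the separate TryGetValue/TryAdd race.
int id = int.Parse(s);
MyObject o = Objects.GetOrAdd(id, key => new MyObject { ID = key });
o.UpdateTime = DateTime.Now;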

  • Your code won't compile - UpdateTime > DateTime.Now.AddSeconds(-20000) is invalid as the LHS is an int and the RHS is a DateTime. If we can't see some real code, it'll be hard to diagnose the problem. Likewise your call to Parallel.ForEach doesn't look right, syntactically... Commented Nov 29, 2010 at 8:29
  • I've corrected it; sorry, that was a mistake when posting. The code has a logic error, but it all compiles fine syntactically. Thanks. Commented Nov 29, 2010 at 9:35

1 Answer


So, for each object you add (30K objects) you create a timer. That's 30,000 active timers.

I think this is creating a lot of overhead.

If this is just for logging/auditing, you should do it with one timer. Possibly keep a separate list/dictionary of the objects you want to log.
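A minimal sketch of that suggestion (not the poster's code; the 5-second sweep interval is illustrative, the 20-second grace period comes from the question): one shared timer periodically sweeps the whole dictionary instead of every object owning its own timer.

using System;
using System.Collections.Concurrent;
using System.Timers;

public class ObjectMonitor
{
    private readonly ConcurrentDictionary<int, MyObject> Objects =
        new ConcurrentDictionary<int, MyObject>();

    // One timer for the whole collection instead of one per object.
    private readonly Timer sweepTimer = new Timer(5000);

    public ObjectMonitor()
    {
        sweepTimer.Elapsed += (sender, e) =>
        {
            DateTime cutoff = DateTime.Now.AddSeconds(-20); // grace period
            foreach (var pair in Objects)
            {
                if (pair.Value.UpdateTime < cutoff)
                    Console.WriteLine(pair.Key + " Lost...");
            }
        };
        sweepTimer.Enabled = true;
    }
}

With this shape, MyObject becomes a plain data holder (ID and UpdateTime only), so adding 150K objects no longer allocates 150K timers.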


1 Comment

Thanks, I'll try removing each object's timer and detecting lost objects with a single global timer instead. By the way, all objects must check in within a certain grace period. Please keep in touch...
