[chuck-users] osc listener shred woes

Stefan Blixt stefan.blixt at gmail.com
Mon Apr 28 12:10:26 EDT 2008


Actually, come to think of it, Mike, why have the Bang object at all? Why
not just move the code (the if statements) from the main loop into the
oscListener loops/shreds, and replace the broadcast with actual
functionality instead? Maybe there is some more advanced stuff that you
haven't included that prevents this?
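
To illustrate what I mean (just a sketch -- the handler name and the
osctype setup are my guesses, since I don't have your full code):

       // hypothetical listener shred: handle the message in place,
       // so no shared Bang object is needed at all
       fun void oscListener( OscRecv recv, string osctype )
       {
           recv.event( osctype ) @=> OscEvent oe;
           while( true )
           {
               oe => now;  // wait for an OSC message on this port
               while( oe.nextMsg() )
               {
                   oe.getInt() => int value;
                   // do here whatever the main loop's if statements did
                   <<< osctype, value >>>;
               }
           }
       }

Since each shred owns its local variables, nothing can get overwritten
by the other listener.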

/Stefan

On Mon, Apr 28, 2008 at 6:00 PM, Stefan Blixt <stefan.blixt at gmail.com>
wrote:

> Yeah, but note that there are two shreds running this loop in parallel.
> So if one event arrives on each port at the same time, chances are that the
> first yield will hand over execution to the second shred, which in turn
> overwrites b.value and b.message. I think you need at least two Bang
> instances to be sure that this doesn't happen.
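> For example (a sketch -- I'm guessing at the surrounding setup and the
> listener's signature):
>
>        // one Bang per listener shred, so simultaneous events on
>        // different ports can't clobber each other's fields
>        Bang b1;
>        Bang b2;
>        spork ~ oscListener( recv1, "/port1, i", b1 );
>        spork ~ oscListener( recv2, "/port2, i", b2 );
>
> Objects are passed by reference in ChucK, so each listener broadcasts
> on its own Bang and the receivers wait on the one they care about.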
>
> /Stefan
>
> On Mon, Apr 28, 2008 at 5:10 PM, Kassen <signal.automatique at gmail.com>
> wrote:
>
> > 2008/4/28 Stefan Blixt <stefan.blixt at gmail.com>:
> >
> > > I don't know about the segv signal, but it seems to me that there is
> > > only one Bang instance that is shared by all iterations/shreds. This means
> > > that if two events arrive at this loop:
> > >
> > >        while( oe.nextMsg() ) {
> > >            oe.getInt() => b.value;
> > >            osctype => b.message;
> > >            b.broadcast();
> > >        }
> > >
> > > the second's values will overwrite those of the first (value and
> > > message from the first event will be lost).
> >
> >
> > I think that's right and I think the way around this would be
> >
> >        while( oe.nextMsg() ) {
> >            oe.getInt() => b.value;
> >            osctype => b.message;
> >            b.broadcast();
> >            //yield to receiving shreds,
> >            //then continue processing the message queue
> >            me.yield();
> >        }
> >
> > This is the exact same principle I talked about earlier in this topic;
> > if you don't yield you don't give the waiting shreds a chance to run.
> > Advancing time here would probably not be a good idea.
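> >
> > For reference, the waiting side would look something like this (a
> > sketch; I'm assuming Bang extends Event, as in Mike's code):
> >
> >        fun void receiver( Bang b )
> >        {
> >            while( true )
> >            {
> >                b => now;  // block until b.broadcast() fires
> >                <<< b.message, b.value >>>;
> >            }
> >        }
> >
> > The me.yield() after broadcast() gives every shred blocked on
> > "b => now" a chance to run and read b.value before the listener
> > loop overwrites it with the next message.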
> >
> > Yours,
> > Kas.
> >
> > _______________________________________________
> > chuck-users mailing list
> > chuck-users at lists.cs.princeton.edu
> > https://lists.cs.princeton.edu/mailman/listinfo/chuck-users
> >
> >
>
>
> --
> Release me, insect, or I will destroy the Cosmos!
>



-- 
Release me, insect, or I will destroy the Cosmos!
