nanogui: SERVER_LOCK() question



Subject: SERVER_LOCK() question
From: Mark Mussetter ####@####.####
Date: 7 Apr 2004 16:27:14 +0100
Message-Id: <5.1.0.14.0.20040407092014.023b0e20@link-comm.com>

I have a quick question about the SERVER_LOCK() macro that gets defined in 
serv.h.  The code looks like the following:

============
#if NONETWORK
/* Use a server-side mutex. */
#include "lock.h"

LOCK_EXTERN(gr_server_mutex);

#define SERVER_LOCK_DECLARE   LOCK_DECLARE(gr_server_mutex);
#define SERVER_LOCK_INIT()    LOCK_INIT(&gr_server_mutex)
#define SERVER_LOCK()         LOCK(&gr_server_mutex)
#define SERVER_UNLOCK()       UNLOCK(&gr_server_mutex)

#else /* !NONETWORK */
/* The Nano-X server is single threaded, so disable the server-side mutex 
(for speed). */

#define SERVER_LOCK_DECLARE /* no-op */
#define SERVER_LOCK_INIT()  do {} while(0) /* no-op, but require a ";" */
#define SERVER_LOCK()       do {} while(0) /* no-op, but require a ";" */
#define SERVER_UNLOCK()     do {} while(0) /* no-op, but require a ";" */

#endif /* !NONETWORK */
============
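
Based on those LOCK_* names, my mental model (an assumption on my part, I have not dug through lock.h itself) is that they are a thin wrapper over pthread mutexes, something along these lines:

============
/* My guess at what lock.h provides -- NOT the actual lock.h,
   just the kind of mapping I am picturing. */
#include <pthread.h>

#define LOCK_DECLARE(name)  pthread_mutex_t name
#define LOCK_EXTERN(name)   extern pthread_mutex_t name
#define LOCK_INIT(ptr)      pthread_mutex_init((ptr), NULL)
#define LOCK(ptr)           pthread_mutex_lock(ptr)
#define UNLOCK(ptr)         pthread_mutex_unlock(ptr)
============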

To me it looks like SERVER_LOCK() expands to a real mutex lock when NONETWORK 
is set (i.e. there is NO network) and to a no-op when there IS a network.  
Why would we need SERVER_LOCK() if there is no network?  Is this backwards?
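
For reference, here is roughly how I would expect the macros to be used around 
a server entry point in the linked-in (NONETWORK) case.  GrExampleCall is just 
a placeholder of mine, not a real Nano-X function:

============
#include "serv.h"   /* for SERVER_LOCK() / SERVER_UNLOCK() */

/* Placeholder entry point, not an actual Nano-X call. */
void GrExampleCall(void)
{
	SERVER_LOCK();      /* serialize access to server-side state */
	/* ... operate on server data structures here ... */
	SERVER_UNLOCK();
}
============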

Thanks,

Mark


