Got them both, as well as CVS.  But that would mean we'd be building on both sites.  We have millions of lines of code, a build can take hours, and we don't have the resources to do that at both sites.

Thanks for trying.

 --- 
Wayne Johnson,             | There are two kinds of people: Those 
3943 Penn Ave. N.          | who say to God, "Thy will be done," 
Minneapolis, MN 55412-1908 | and those to whom God says, "All right, 
(612) 522-7003             | then, have it your way." --C.S. Lewis





________________________________
From: gm5729 <gm5729 at gmail.com>
To: TCLUG Mailing List <tclug-list at mn-linux.org>
Sent: Fri, March 12, 2010 10:51:45 AM
Subject: Re: [tclug-list] GFS over a WAN

Sounds like you need Git or SVN for revision control, and for
accountability and traceability of changes.
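
For the source side of things, even a plain bare Git repository on each
site that people push to and fetch from over SSH would give you history and
accountability.  Rough, untested sketch (hostnames and paths below are made
up):

    # on the TX server, create a bare repo to act as the shared copy
    ssh tx-server 'git init --bare /srv/repos/product.git'

    # in MN, point an existing working tree at it and push
    cd /home/build/product
    git init
    git add .
    git commit -m "import current tree"
    git remote add tx ssh://tx-server/srv/repos/product.git
    git push tx master

    # anyone on either site can then clone and pull full history
    git clone ssh://tx-server/srv/repos/product.git

Git won't love 4GB binary build outputs, though; it's really for the source
tree and scripts.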




On Fri, Mar 12, 2010 at 10:29, Wayne Johnson <wdtj at yahoo.com> wrote:
> Our company was just acquired by one in TX.  We are now in the position of
> having to keep local and remote Linux (CentOS 5.4) file servers synced up.
> We have an automated build process that spits out a build of about 4GB every
> night, or even more often at times.  This is now being done on the corporate
> network.  We have local test facilities that use these builds, as well as
> corporate facilities that need access to our local resources.  It's all
> getting to be pretty messy.
>
> We are trying to keep the file servers synced up using rsync.  Unfortunately
> it sometimes takes a while for this huge amount of data to be synced up, or,
> worse, someone makes a change to the local server's files, which gets
> overwritten on the next rsync.
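>
> One thing I may try, though I haven't tested it yet, is rsync's --update
> (-u) flag, so files that are newer on the receiving side don't get
> clobbered (the paths below are just placeholders):
>
>     rsync -avu /exports/builds/ tx-server:/exports/builds/
>
> That only papers over part of the problem, though; changes made on the TX
> side still don't come back to us automatically.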
>
> I was wondering if there was a way to create a common file system between
> both sites (here in MN and TX).  GFS sounds like it might work, but I have
> not found anyone via Google who claims to have done this.  I remember there
> used to be AFS, which worked in a similar fashion.
>
> Guess I'm hoping to set something up where files will exist on both
> networks.  When a file is opened, the local and remote copies are compared
> and the newer one is used.  If the remote copy is newer, it's transferred
> locally as it's used, so the local cache is updated and the next access is
> entirely local.  Am I dreaming?
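>
> Just to make the idea concrete, here is roughly the logic I'm picturing as
> a toy shell sketch (the hostname and path are made up, and I realize a real
> network file system would do this at open() time rather than in a script):
>
>     #!/bin/sh
>     # pull-if-newer: refresh a local file from the TX server only if the
>     # remote copy has a newer modification time.
>     remote=tx-server
>     f="$1"
>     local_mtime=$(stat -c %Y "$f" 2>/dev/null || echo 0)
>     remote_mtime=$(ssh "$remote" stat -c %Y "$f" 2>/dev/null || echo 0)
>     if [ "$remote_mtime" -gt "$local_mtime" ]; then
>         # remote copy is newer: update the local "cache" before using it
>         rsync -a "$remote:$f" "$f"
>     fi
>
> I'd much rather the file system did that for me, which is why GFS and AFS
> caught my eye.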
>
> Anyone have any ideas?
>
> Thanks.
>
> ---
> Wayne Johnson,             | There are two kinds of people: Those
> 3943 Penn Ave. N.          | who say to God, "Thy will be done,"
> Minneapolis, MN 55412-1908 | and those to whom God says, "All right,
> (612) 522-7003             | then, have it your way." --C.S. Lewis
>
>
>



-- 
If there is any question as to the validity of this email, please phone for
validation.  Proudly presented by Mutt, GnuPG, Vi/m and GNU/Linux via
CopyLeft.  GNU/Linux is about the freedom to compute as you want and need
to, and to share your work unencumbered and have others do the same with
you.  Key: 0xD53A8E1

_______________________________________________
TCLUG Mailing List - Minneapolis/St. Paul, Minnesota
tclug-list at mn-linux.org
http://mailman.mn-linux.org/mailman/listinfo/tclug-list



      