Confused over CIFS
Ted Hilts
thilts at mcsnet.ca
Wed Jan 14 04:36:22 UTC 2009
Preston Kutzner wrote:
>
> On Jan 12, 2009, at 11:01 PM, Ted Hilts wrote:
>
>> I've read everything you guys have posted - thanks. I think there are
>> issues you have not addressed that were relevant to my particular
>> situation.
>>
>> 1. There are two machines. One is called "misty" (XP Home with
>> NTFS-formatted disks) and is the FROM machine for the data transfer.
>> The second is called "Ubuntu" (Ubuntu 8.04, upgraded from 7.10, on its
>> own partition and dual-booted with XP Home on the same box). Note that
>> the 7 hard drives on the Ubuntu machine are NTFS-formatted disks, with
>> one of them partitioned to hold the Ubuntu system, which is all I run
>> on this machine. Ubuntu has direct access to all these NTFS-formatted
>> drives as part of its local system. Ubuntu is the TO machine for the
>> transfer, and the data will be placed on one of those NTFS-formatted
>> drives. So the data is moving from "misty" (XP Home, NTFS drive) to
>> "Ubuntu" (Ubuntu 8.04, NTFS drive).
>
> First, just out of curiosity, how much data are you trying to
> transfer? Second, is this data a bunch of small files or a handful of
> very large files? This can make a difference in file copy times.
> What type of network are these computers connected to?
>
>>
>>
>> 2. I used the traditional "smbmount", not "mount.cifs", although it
>> seems it was CIFS that was engaged in the transfer when I looked at
>> the mounts. The copy using "cp" worked very well and fast at first,
>> but went almost dead after processing a lot of data, so I tried to
>> bring the transfer to an end, with "Ubuntu" responding that I should
>> use "umount.cifs". That seemed to fail, but thinking back it was
>> probably just taking a long time to stop. The data flow could have
>> slowed for a number of reasons; it may not have been a failure on the
>> part of "cp" or CIFS at all. I often get glitches (I think from my
>> ISP) which have a similar effect on running processes: they almost
>> stop, or appear to stop, and after a while they are running again,
>> though in some cases they have to be restarted. The transfer I was
>> running would have taken a long time in any case, because when I ran
>> it I knew literally nothing about rsync and was using plain old "cp".
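>>
>> For reference, this is roughly what I ran, with the share and mount
>> point names changed (treat these paths as placeholders):
>>
>>   # mount the XP share, then copy with plain cp
>>   sudo smbmount //misty/news /mnt/misty -o username=ted
>>   cp -a /mnt/misty/. /media/sdg1/archive/misty/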
>
> rsync might be a better option for you. It is proven technology that
> has been in use for many years, and as others here have mentioned, it
> also supports compression.
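>
> Something along these lines would be a starting point once the share
> is mounted (the paths are placeholders for your actual mount point
> and destination):
>
>   rsync -av --progress /mnt/misty/ /media/sdg1/archive/misty/
>
> One caveat: rsync's -z compression only helps when rsync itself is
> talking over the network (e.g. rsync over ssh); for a copy between
> two locally mounted filesystems it mostly just costs CPU.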
>>
>>
>> 3. Do the FROM and TO filesystems (both being NTFS) have anything to
>> do with this data transfer problem?
>
> This is possible. The ability to write to NTFS partitions under Linux
> is a fairly new development. While it is obviously deemed ready for
> production use, I'm still not sure it's up to par with other
> filesystems; that's only my take on it, though. I'm curious as to why
> you're using NTFS for local disks under Linux. Is there a specific
> reason? You would be better off using a Linux-native filesystem such
> as ext3 (or the upcoming ext4) or XFS. They're much better at
> preventing file fragmentation and also support journaling, and XFS is
> very good at handling large files. As I understand it, the ntfs-3g
> driver was reverse-engineered and therefore might not be as optimized
> as a native NTFS driver. Again, if you have a specific reason to be
> using NTFS drives on a single-OS Linux install, disregard this. But
> if not, I would suggest reformatting the drives to a Linux-native
> filesystem. ntfs-3g *might* be your bottleneck.
>>
>>
>> 4. I have been looking at using backuppc, but the man page mentions
>> numerous tasks in setting it up. The man page does seem to indicate
>> that XP Pro is required and says nothing about XP Home. I should
>> never have gotten involved with HP Home machines, as they are nothing
>> but a big headache when it comes to networking with Linux machines.
>
> I don't think that's HP-specific, but rather a Windows problem.
> Windows doesn't play well with anything that isn't Windows. But yes,
> HP consumer-end machines tend to be problematic.
>
> If you can spare one of the 7 drives in your machine, and it's large
> enough, I would try reformatting it as ext3 and running your file
> copy again. It would, at the very least, rule out ntfs-3g as the
> reason for the file copy hanging.
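>
> Something like this would do it, where sdX1 stands in for the spare
> drive's partition (double-check the device name with "sudo fdisk -l"
> first, since mkfs destroys whatever is on it):
>
>   sudo umount /media/sdX1
>   sudo mkfs.ext3 /dev/sdX1
>
> Then remount it and re-run the copy to see if the hang goes away.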
>
>>
>>
>> Thanks -- Ted
>>
>>
>
Preston:
I rely heavily on Samba and have most of the machines in my LAN
cross-mapped with shares to each other. I have Linux (various flavors)
and Windows (various flavors). Sorry for the typo: I was not
complaining about HP machines but rather about MS XP Home.

The machine I call "Ubuntu" is used for administration, and I use it
to make VNC connections. For example, as I write this I am using an XP
Home machine called "cic2ext" and am controlling it with "Ubuntu's"
keyboard and monitor. At the same time I am running 6 machines. None
of my machines are very fast except for "Ubuntu" and "misty".
Depending upon the particular project, I need a number of machines to
handle the data input requirements.

For example, one project is to gather and archive news reports from
all over the world and from a number of different agencies in order to
establish their integrity (a lot of bias gets into news reports, both
intentionally and unintentionally). One of my machines cannot handle
this task by itself, and the overall process requires my intervention
(which is time consuming). So I end up running several machines, all
doing different parts of the overall news-gathering task. For this
project I have to set up each machine to run the same applications and
then interact manually at various stages of their progress. I used to
build programs to automate the whole process, but I ended up
constantly revising them and spending more time building the programs
than collecting the news: every time a news agency changed their web
pages I had to revisit my automated programs, which required my
intervention anyway. So I ended up using the Firefox browser with
various add-ons, such as NewsFox to gather RSS articles and ScrapBook,
which captures a tree or tabbed news article.

The machine "misty" handles about 300 different news sources, another
machine called "CICERO" handles about 50, the machine called "Ubuntu"
handles about 100, and so on. Most of the machines are XP, and they
archive onto the NTFS disk share (/media/sdg1) located on the "Ubuntu"
local system. If this reliance on XP machines were not necessary,
there would be no problem using ext3. I realize that XP machines can
access ext3, but when I researched the idea I was not comfortable with
it because of the number of XP machines involved. Remember, "Ubuntu"
is dual-booted with an MS XP Home OS. I installed Ubuntu as an
experiment and fell in love with it. So that is the story; right now
there is no problem other than the one I reported.

I'm a little nervous about rsync because the files are numerous and
small -- even the tree files. I don't want rsync doing anything fancy,
such as its feature of appending to an existing file that has changed.
That is probably not a problem if rsync does its thing and afterwards
does not alter a file. I like the idea of compression to make the data
transfer faster.
What would the command line look like if I used rsync instead of what
I did? I think if I use rsync I may have to install SSH for Windows on
all the XP Home machines, as well as rsync for Windows; I have been
looking at PuTTY for that purpose. Also, I think some configuration
files need to be created? If so, I would appreciate some examples.
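From the reading I have done so far, I am picturing something like
this on the "Ubuntu" side, pulling from the mounted share (all of the
names and paths here are just my guesses at what the pieces would be
called):

  sudo mount -t cifs //misty/news /mnt/misty -o username=ted
  rsync -av /mnt/misty/ /media/sdg1/archive/misty/

Or, if I put an SSH server and rsync on the XP machines (which is
where PuTTY, or maybe something like cwRsync, would come in):

  rsync -avz ted@misty:news/ /media/sdg1/archive/misty/

Is that roughly right?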
After this particular data transfer runs properly, I need to create a
universal (Windows and Linux) backup so I can restore a partition or
an entire disk. I have been researching all this, but it is very time
consuming, and it might be easier to write a shell script and build
into it each piece needed to handle the various data transfers and
backup scenarios; I could do that one piece at a time instead of the
whole thing in one shot. For example, to use backuppc one has to
involve Perl, set environment variables, and set up configuration
files. Then the same for SSH, then the same for rsync, and other
things.
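From the docs, the per-host BackupPC setup seems to come down to Perl
config files. As a guess (the host, share, and password here are
invented, and I have not tested any of this), a minimal SMB host entry
might look something like:

  # per-host config file (location depends on the install) -- untested
  $Conf{XferMethod}       = 'smb';
  $Conf{SmbShareName}     = ['news'];
  $Conf{SmbShareUserName} = 'backup';
  $Conf{SmbSharePasswd}   = 'secret';

Is that roughly the shape of it?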
Thanks, Ted