<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 TRANSITIONAL//EN">
<HTML>
<HEAD>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; CHARSET=UTF-8">
<META NAME="GENERATOR" CONTENT="GtkHTML/3.28.1">
</HEAD>
<BODY>
On Sat, 2009-11-28 at 11:41 -0800, Ray Parrish wrote:
<BLOCKQUOTE TYPE=CITE>
<PRE>
Hello,
I have been trying to figure out why ls will not return only folder
names, as the man page insists it will with the -d switch.
Here is the output from the terminal:
ray@RaysComputer:~/links$ ls --directory
.
ray@RaysComputer:~/links$
No sub folders were returned by the ls command. Yet when I run the
following, you can see the first few lines are all folders that clearly
exist in that directory.
ray@RaysComputer:~/links$ ls -a --classify --group-directories-first
./
../
Desktop/
GSD/
Images/
scripts/
Video/
481445.png
AddURLtoGoogleSitemapScript.html
AddURLtoGoogleSitemapScript.html~
AddURLtoSitemap.sh.zip
What I'm ultimately trying to get is a recursive list of all files of a
certain type in all sub folders of the current folder, including the
current folder itself. But when I add the --recursive switch and include
a *.html on the end, all I get is a list of the .html files in the
current folder, and none from any of the sub folders. Like so -
ray@RaysComputer:~/links$ ls -x -1 --recursive *.html
AddURLtoGoogleSitemapScript.html
Biology.html
bookmarks.html
BootingintoSafeMode.html
buttonbar.html
Change Skin.html
CharitableOrganizations.html
Combatting Spam.html
Comparison of Windows and Ubuntu Linux.html
Computer.html
Computer Tips and Tutorials.html
Email Information.html
EnvironmentalIssues.html
error404.html
EtherApeNetworkSecurityTool.html
FomentingResponsibleLeadership.html
google34436ce73104f9d5.html
GUIWebSiteMonitor.html
GUIWrapperforWebChangeMonitorScript.html
HardDiskRequirementsXPServicePack3.html
Hard Drives.html
HowToInstallSoftwareinUbuntu.html
How To Install Ubuntu.html
How to speed up and organize tab switching in Firefox.html
index.html
InformationAboutWindowsXPServicePack3.html
InsuringResponsibleLeadership.html
Internet.html
Internet Radio Stations.html
Internet Speed Test.html
Is The Eiffel Tower Taller than the Space Needle.html
Learning Linux References.html
Linux distributions and documents.html
LinuxMonitorWebPageforChangeScript.html
MissingTitlebarsinUbuntu.html
NoMorePaidLobbyists.html
Offline Browsing Tools.html
OnlineMagazines.html
PasswordLoadingScript.html
PayingofftheNationalDeficit.html
PossibleProblemsXPServicePack3.html
PostingRules.html
Programming.html
ReleasingHungModem.html
ScienceandTechnology.html
Security Measures.html
Skinning Web Sites.html
StimulusInvestigation.html
Submit Link.html
Survey Sites.html
Template.html
ThankYou.html
The Future of Technology.html
TimerStopWatchScriptforLinux.html
Troubleshooting and fixing Windows.html
Troubleshooting HiJackThis Log Files.html
Troubleshootings Tools for Windows.html
Video.html
Visual Basic Tips.html
Web Browsers.html
Web Editors.html
Web Hosting.html
Web Publishing.html
Web Site Monitor.html
WhatisHardDiskFragmentation.html
Whatisrecursion.html
WhereAreProgramFilesinUbuntu.html
Where to go to get help.html
Why Dual Boot.html
Windows XP SP3 installation.html
ray@RaysComputer:~/links$
Note that none of that output came from any of the sub folders, so
--recursive is not doing what I expected. I can get a recursive listing,
but it includes all file types, and the sub folder names in its output.
Is there any way to get just the sub folder names? Is there any way to
get a recursive listing of only one file type, with the relative path
from the current folder included on each line?
Thanks for any help you can be, Ray Parrish
--
The Future of Technology.
<A HREF="http://www.rayslinks.com/The%20Future%20of%20Technology.html">http://www.rayslinks.com/The%20Future%20of%20Technology.html</A>
Ray's Links, a variety of links to useful things, and articles by Ray.
<A HREF="http://www.rayslinks.com">http://www.rayslinks.com</A>
Writings of "The" Schizophrenic, what it's like to be a schizo, and other
things, including my poetry.
<A HREF="http://www.writingsoftheschizophrenic.com">http://www.writingsoftheschizophrenic.com</A>
</PRE>
</BLOCKQUOTE>
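If I am reading the man page right, that is actually the documented
behaviour rather than a bug: with no arguments, <B><I>ls -d</I></B> lists the
current directory itself (hence the lone "."), and with
<B><I>ls --recursive *.html</I></B> the shell expands *.html to plain files in
the current folder before ls ever runs, so there are no directory
arguments left for --recursive to descend into. To get just the sub
folder names with ls, you can hand it the directories explicitly:
<PRE>
ls -d */
</PRE>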
Sounds like a job for <B><I>find</I></B> and <B><I>grep</I></B>
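<BR><BR>
Something along these lines should do it (untested here, so treat it as
a sketch and adjust the pattern to taste):
<PRE>
# Just the sub folder names of the current folder:
find . -mindepth 1 -maxdepth 1 -type d

# All .html files in the current folder and every sub folder, printed
# with their paths relative to the current folder:
find . -type f -name '*.html'
</PRE>
If you want the full ls-style details for each match, you can hand the
results back to ls with find's -exec, e.g.
<B><I>find . -type f -name '*.html' -exec ls -l {} +</I></B>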
</BODY>
</HTML>