This course will focus on the Linux command line and the text-manipulation tools that let you effectively control just about anything on your system. We'll learn about terminal environments, working with text streams, file management and archives, system processes, advanced text searches, and terminal text editors.
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
In the previous video, we discussed file management: how you can control and manage your files directly, but also indirectly by piping file names and attributes between commands. But if pipes increase the versatility of file manipulation, just imagine what they can do for data streams.
Now don't get all worried. Data streams don't have to be all that complicated. A stream can be something as simple as using "cat" to display the contents of a file on your screen. That streamed file1 to the screen, as we've seen before. You can also use cat to redirect the contents of a file to another file, like this. As we've seen many times already, you can also pipe data from one program to another, where the pipe symbol is produced by hitting the Shift and backslash keys together. Let's use the cat program to stream the contents of the syslog file to the grep program, printing to the screen only those lines containing the word eth0, which is, of course, the name of my first network interface.
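The commands demonstrated in the video can be sketched like this. Since we don't have the video's actual file1 or syslog on hand, small stand-in files (file1, file2, sample.log) are created here so the commands run anywhere:

```shell
# Create a small sample file to stand in for the video's "file1"
printf 'hello\nworld\n' > file1

# Stream the file's contents to the screen
cat file1

# Redirect the contents into another file instead of the screen
cat file1 > file2

# Pipe one program's output to another: print only lines mentioning eth0
# (the video greps /var/log/syslog; a tiny sample log stands in here)
printf 'eth0 up\nlo loopback\neth0 carrier\n' > sample.log
cat sample.log | grep eth0
```

The final command prints only the two eth0 lines; the loopback line is filtered out by grep.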
Working with Linux data streams
Linux categorizes all data flows as one of three file descriptors: stdin, standard input, which is designated by the number zero; stdout, standard output, identified by one; and stderr, standard error, identified by two.
Standard input obviously is the data that is input into a program while standard output is the data that's output from a program. Let's illustrate this.
Let's run the program "tail" which, if you'll remember, outputs a file's last lines, against syslog, and redirect the output using one and a forward arrow to a new file called, say, "log data." The number one tells Linux to redirect the standard output, meaning whatever lines of the syslog file tail delivers become the contents of the file log data. You don't actually need to include the one in this case, since one happens to be the default for the redirect sign anyway. If you wanted to simply append the contents of syslog to the end of the file's current contents, without overwriting whatever's there already, you would use two forward arrows, like this. That's how standard output works.
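Here's a runnable sketch of the stdout redirection steps above. A generated sample.log stands in for syslog, and "logdata" (one word, an assumption about the video's "log data" file name) is the target file:

```shell
# Build a small log file to stand in for syslog
printf 'line1\nline2\nline3\n' > sample.log

# "1>" redirects file descriptor 1 (standard output) into a new file
tail -n 2 sample.log 1> logdata

# Descriptor 1 is the default for ">", so this is equivalent
tail -n 2 sample.log > logdata

# ">>" appends to the file instead of overwriting it
tail -n 1 sample.log >> logdata
```

After these commands, logdata holds line2 and line3 from the overwriting redirect, followed by a second copy of line3 from the append.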
Let's see what standard error will produce for us. We'll cat our file1 file, redirecting the standard error output to a file called errors. Viewing errors shows us that the file is still empty. That makes sense, because the previous cat command worked, so it wouldn't have produced an error. Now let's try to cat a file that doesn't actually exist and view the errors file once again. This time, there's something there.
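The stderr demonstration looks like this in practice. The file names (file1, errors, no-such-file) mirror the ones described in the video:

```shell
# A successful cat writes nothing to stderr, so "errors" stays empty
printf 'data\n' > file1
cat file1 2> errors

# cat-ing a nonexistent file sends its complaint to stderr,
# which "2>" captures in the errors file ("|| true" keeps the
# script moving past the expected failure)
cat no-such-file 2> errors || true
cat errors
```

Note that cat's normal output (the contents of file1) still goes to the screen in the first command; only descriptor 2 is being redirected.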
Linux doesn't restrict a stream to a single output target: tee lets you send data off in two separate directions at the same time. This command will list, in long form, the contents of the current directory on the screen, but also write that listing to the file list.txt.
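A minimal version of the tee command described above, using a freshly created demo directory so the listing is predictable:

```shell
# Create a small directory to list
mkdir -p demo && touch demo/a demo/b

# tee splits the stream: the long listing appears on the screen
# AND is written to list.txt at the same time
ls -l demo | tee list.txt
```

Running `cat list.txt` afterward shows the same listing that appeared on screen.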
Finally, you can redirect data streams as input to the xargs program. xargs will execute commands like rm in cases where you need more complicated filters than the command itself would normally allow. From our root directory, and using sudo, let's use find to search for all files with a .c extension, but pipe that list to xargs, which will run grep to filter out only those files that contain the text string "stdlib.h."
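The find-and-filter pipeline can be sketched as follows. Rather than searching from the root directory with sudo, a tiny throwaway source tree is built here so the example is self-contained; `grep -l` prints only the names of matching files:

```shell
# Set up a small tree of C files to search (stands in for a real source tree)
mkdir -p src
printf '#include <stdlib.h>\nint main(void){return 0;}\n' > src/a.c
printf 'int add(int x, int y){return x+y;}\n' > src/b.c

# find lists every .c file; xargs hands that list to grep,
# which prints the names of files containing "stdlib.h"
find src -name '*.c' | xargs grep -l 'stdlib.h'
```

Only src/a.c is printed, since it's the only file that includes stdlib.h.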
Let's review. You can redirect a stream to overwrite the contents of a file using the forward arrow. Two forward arrows will append the new text to the file's existing contents. The pipe symbol will stream data to the program to the right of the pipe. There are three file descriptors: standard input, standard output, and standard error. One and a forward arrow will redirect standard output, while two and a forward arrow will redirect standard error. tee directs a stream to multiple targets, and xargs controls command execution on piped data.
David taught high school for twenty years, worked as a Linux system administrator for five years, and has been writing since he could hold a crayon between his fingers. His childhood bedroom wall has since been repainted.
Having worked directly with all kinds of technology, David derives great pleasure from completing projects that draw on as many tools from his toolkit as possible.
Besides being a Linux system administrator with a strong focus on virtualization and security tools, David writes technical documentation and user guides, and creates technology training videos.
His favorite technology tool is the one that should be just about ready for release tomorrow. Or Thursday.