
How to Browse Your Facebook Like It’s Pinterest [Quicktip]

Posted by Harshad


Posted: 13 Apr 2012 07:37 AM PDT

I’m sure you know by now about Pinterest, one of the fastest-growing social networking sites ever. It connects people not only through their friend networks, but through their shared interests. While Facebook has great virality, Pinterest offers greater content exposure to a targeted audience, and this enhances the content-sharing experience. Timeline for Brands on Facebook may have its benefits and appeal, but if you secretly wished that it was just as easy to use as Pinterest, then we have good news for you.

[Screenshot: Pinvolve applied to the Hongkiat Facebook page]

In this quick-tip, we will guide you through Pinvolve, a Facebook app that automatically rearranges and polishes your content presentation to mimic the style of Pinterest. On top of that, the app adds a pin button to each piece of shared content, making it easier for your fans to share your featured content to Pinterest.

Convert your Fan Page into a Pinterest-style layout

To get started with the Pinterest-like layout, go to the Pinvolve Facebook app page and click on the button ‘Add to my page’.

[Screenshot: the Pinvolve app page with the ‘Add to my page’ button]

You will be given the option to select which page to add this feature to. If you have more than one Facebook Page, select the pages you want to convert, then click ‘Add Page Tab’.

[Screenshot: selecting a page in the ‘Add Page Tab’ dialog]

That is all you need to do, and now your page is ready to be viewed in Pinterest-style. You can view our page which went through the Pinvolve treatment here.

[Screenshot: the Hongkiat page after the Pinvolve treatment]

At this point, you will notice from the main Facebook Brand Timeline page, a new icon tab just below the cover photo like so:

[Screenshot: the Pinvolve tab below the Timeline cover photo]

Any time you need to view your Timeline in Pinterest-style, just click on the Pinvolve tab.

Conclusion

Facebook already has the virality to reach large audiences, but with the Pinvolve app you can expand your reach to Pinterest users as well, and increase your brand awareness across two booming social networks.

Related posts:

  1. How to View Facebook Photos, Pinterest Style [Quicktip]
  2. How to Protect Your Pictures from Pinterest Pinnings [Quicktip]
  3. How to Rename Facebook Page Vanity URL [Quicktip]
  4. How to Listen to Music with Friends on Facebook [Quicktip]

Basic Shell Commands For Bloggers

Posted: 13 Apr 2012 04:01 AM PDT

The system of shell commands is one of the oldest languages for systems communication. Computers have been able to accept command line input since the very early days of electronic computing, even before Operating Systems were fully developed. Now, in 2012, it’s clear to see how far we have come.

For tech enthusiasts and bloggers, understanding some basic shell commands can save you a lot of time. Learning how to interact with the terminal and write command line statements is a huge topic, so by no means should you expect to fully understand the discussions here on your first try. But if possible, spend a bit of time researching and building your knowledge of the Command Line Interface.

[Image: a terminal prompt]
(Image source: n3wjack)

I’ll be sharing some great tips below for bloggers across the world. Keep in mind that almost any computer you use, whatever its GUI, offers some form of command line. This is the basis of all computing: input commands, receive direct output. Before jumping into the commands and syntax, let’s clear up some history first.

Linux Shell Command – In a Nutshell

There are so many terms in use here that it may help to clarify a few. Below are brief definitions for some commonly confused vocabulary.

  • shell – a basic program which takes user input and executes commands; “shell” is usually a generic term referring to any command-line interface.
  • terminal – the program (or connection) through which an end user talks to the computer system.
  • Bash – the Bourne-again shell, the shell and scripting language most popularly used in Linux environments.
  • command – input issued to the computer as a set task or list of instructions.
  • kernel – the core software at the heart of most Operating Systems. The kernel carries out the work that shell commands request involving physical computer resources, i.e. memory allocation, hardware, external devices, CPU scheduling, etc.

It is important to note that this system has been around for a very long time. In fact, the command line functions on Linux and Mac OS X are for the most part identical. This is because Linux was built as a free, open-source, Unix-like Operating System, while Apple originally built OS X on BSD, which is a Unix system.

Windows stands as the odd one out, having been built over classic DOS (Disk Operating System). Some commands are similar, but for the most part any command line interaction with a Windows OS will be much different from a Linux/Unix system.

Opening a New Terminal

The terminal window is the black box with a blinking cursor eagerly awaiting your input. This can be brought up via any GUI menu or by assigning a keyboard shortcut. In a Linux GUI you’ll be looking for an application named terminal or konsole. Consult the online documentation for your specific Linux release, such as Ubuntu or Debian.

[Screenshot: a terminal window]

If you are in a Mac environment, the quickest way to bring up a terminal window is through Spotlight. command + space bar will open a brand new Spotlight search, or you can also click the magnifying glass for a dropdown panel. Type “terminal” and the results list should populate quickly.

Getting Started

Now that you’ve got a terminal window open, we can get started! To begin you’ll want to understand navigating the directories. pwd outputs your active directory, and coupled with ls you can parse the current directory and return a file listing. The former command stands for Print Working Directory while the latter means List Files/Directories. Both of these are safe to play with and won’t damage or edit any files.

When you are dealing with a returned file listing you should notice a few things. Firstly, the list will include both single files and directories. An entry with no document extension (.jpg, .gz, .rpm) is often a directory, but not always; run ls -l or ls -F if you need to tell them apart for certain. You can move up and down between directories with the cd command. This stands for Change Directory and should work as you expect.

A shortcut for maneuvering one directory upwards uses cd ../ – The beauty of this trick is how quickly you can navigate back between directories and find exactly what you’re looking for. Every time you move up a level call pwd to see where you’re at. If you are looking for a specific folder also call ls so you can get an idea of where to move next.

[Screenshot: pwd output in a shell session]

To navigate from the root directory, simply begin the path with a forward slash. For example, if you are deep in the file system, it’s not required that you move up one directory at a time. Simply call cd /home and hit enter to jump straight to the /home directory.
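The navigation commands above can be tried out safely in a scratch directory; here is a minimal session (the /tmp/demo path is just an example):

```shell
# a minimal navigation session in a scratch directory
mkdir -p /tmp/demo/photos   # build a small directory tree to wander around in
cd /tmp/demo/photos
pwd                         # prints /tmp/demo/photos
cd ../                      # move one directory upwards
pwd                         # prints /tmp/demo
ls                          # lists the contents: photos
cd /tmp                     # an absolute path jumps there directly
```

Run pwd after every move and you will never lose track of where you are.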

Manipulating Files and Folders

Now that it’s possible to traverse the inner workings of your file system, we should get into building files. If you aren’t a fan of the Graphical User Interface for making directory paths, look no further than the simple command line. mkdir stands for Make Directory and is the quickest way to build a solid file structure.

If you’re logged in as root then you’ll have no problems messing around. Be on alert, however, as sometimes file permissions can be overly-strict and limit your access to create new directories. Check out the mkdir documentation page for examples of arguments.

To cover this further each command comes with a set of possible arguments. These can be passed after typing the command to apply additional settings. A universal example is --help which always displays a list of features and support topics for the current command. Try typing in mkdir --help and see what you get back.

The cp and mv commands are used to copy and move files, respectively. You’ll need to have both directories already written out and pointing towards where the file will go. Each command requires 2 arguments, the first being a file of choice and the second a new destination to copy or move into. Similarly rm filename can be used to delete (remove) files and rm -rf directory_name/ to remove directories. But be careful here as there is no undo feature within shell!
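The cp, mv, and rm commands described above can be sketched in one short session (the /tmp/sketch paths and file names are made-up examples):

```shell
# copying, moving, and removing files in a scratch directory
mkdir -p /tmp/sketch/backup
echo "draft" > /tmp/sketch/post.txt
cp /tmp/sketch/post.txt /tmp/sketch/backup/    # copy: the original stays put
mv /tmp/sketch/post.txt /tmp/sketch/final.txt  # move (rename): the original is gone
rm /tmp/sketch/final.txt                       # remove a single file
rm -rf /tmp/sketch/backup/                     # remove a whole directory tree
```

Note how mv doubles as a rename command when both arguments live in the same directory.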

Matching Wildcard Patterns

Being able to move files and copy folders does provide a convenience. But ultimately, putting this knowledge to good use requires a bit more finesse. Traditionally you’d be using shell scripting to automate large tasks which you’d rather not handle yourself.

With wildcards you’ll be able to target multiple files instead of a single name. When typing in your target pattern there are two additional symbols to play around with. An asterisk (*) is used to denote any number of wildcard characters, while a question mark (?) denotes any single character.

Brackets can also be used to denote patterns. Within a set of brackets you can list which characters a single position may match, such as [xyz]. There is also a handful of named character classes that go inside the brackets with colon delimiters: [[:alnum:]] for alphanumeric characters and [[:alpha:]] for alphabetic characters only. If you’re looking to target only numerals, [[:digit:]] works just as well.

This whole system seems abstract without examples, so I’ve provided a few below.

  • a* – matches all file names beginning with the letter “a”
  • foo*.txt – matches all text files beginning with the letters “foo”. Note this will only return text files, even if you have other folders beginning with foo
  • photo?? – matches all files and folders which begin with the word photo and follow up with exactly 2 more characters
  • [xyz]? – matches any filename beginning with x, y, or z and followed by exactly 1 more character

I think you get the point here. The wildcard system is very complex, and it’s certainly not for the faint of heart. Do not expect to fully understand its capacity after just one day in the terminal. It takes a lot of practice and repetition to get well-versed in shell scripting and wildcard callouts. Review the Tux Files info page for more examples and information.
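The patterns above are easy to test against a scratch set of files (the file names here are arbitrary examples):

```shell
# try the wildcard patterns on a throwaway directory
cd "$(mktemp -d)"
touch alpha.txt apple.txt foo1.txt foobar.txt photo01 photo02
ls a*         # alpha.txt apple.txt
ls foo*.txt   # foo1.txt foobar.txt
ls photo??    # photo01 photo02
```

Because the patterns only drive a harmless ls here, you can experiment freely before using them with anything destructive.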

File Compression and Storage

Building and creating archive files is just part of the modern computer experience. I am frequently e-mailing and downloading new .zip archives each day. These contain graphics, icons, library code, fonts, Photoshop mockups, and a whole lot more. The act of archiving directories not only reduces file size but makes transport a lot easier.

When working within Linux/Unix there are a few commands you can use to archive data. The two frequently touched upon are zip and gzip. The differences aren’t very extreme and certainly not notable enough to require one over the other. They are just different mechanisms for compression, data storage, and file schemas.

Each of these commands features a wonderful platter of possible arguments. You can view the complete list from About’s Linux info page on the zip command. The -r flag is possibly the most widely known, and stands for recursively pulling in all files and zipping them together. This means a command such as zip -r newarchive myfolder will pull all the files out of myfolder and add them into a new archive named newarchive.zip. Without -r you would need to specify each individual file name after the archive name. Talk about shaving off time!

Now the command for gzip works very similarly and shares a lot of the same arguments. The choice to use gzip over zip is really a personal one and will not interfere with any of your file structures. If you’re moving files between different operating systems I recommend sticking with .zip as it is more accepted in the Windows community. But we live in an age of bountiful software and open source projects, so it’s not truthful to say Windows can’t handle .gz archives. But the archive file format just isn’t as popular.

When receiving zipped archives you can also unzip them into new directories solely from the command line. Both unzip and gunzip are the counterparts to their original archive commands. Similarly, the list of arguments is just as long, if not longer. However, the basic unzip command only needs a file location to perform the extraction. If you are comfortable working with archive software, this method should be exactly the same in any Mac OS X environment.
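The gzip side of this is easy to try safely. Here is a minimal round trip in a throwaway directory (the file name and contents are just examples):

```shell
# a gzip/gunzip round trip on a scratch file
cd "$(mktemp -d)"
echo "hello archive" > notes.txt
gzip notes.txt        # produces notes.txt.gz and removes the original
ls                    # notes.txt.gz
gunzip notes.txt.gz   # restores notes.txt
cat notes.txt         # hello archive
```

Notice that gzip replaces the original file rather than keeping it alongside the archive; pass -k if you want to keep both.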

Working as a Super User

If you work in the terminal a lot, then super user access will come in handy, especially as a web developer or blogger; you’ll find permission errors become extremely annoying after the third or fourth time.

Certainly it’s possible to directly log into the root account and run terminal commands from there. However this is understood as bad practice in the Linux realm, as the root user should only be used in an emergency to fix or correct a system failure. Or if you just happen to forget your main login password!

Now to get into the system as super user you will need the root password. In your terminal window simply type su and hit enter. This stands for substitute user and without any further arguments will assume you’re looking to access root. Type in the password and hit enter, you should be directed into a new line running under root@yourcomputer. To revert back into your account use the exit command.

Now, this does work well for the majority of Linux/Unix systems. But if you work on a Linux box running Ubuntu or a similar OS, you’ll notice changes to the super user interface. Ubuntu users will instead work with the command sudo, which grants super user access for just a single command.

This means you won’t be logged into the terminal as super user, but can run any command as super user by prefixing it with sudo. Take note that Ubuntu is one OS of choice which uses the sudo command; Apple’s OS X terminal is another system which capitalizes on it. After you hit enter you’ll be asked to input your own account password, and afterwards the command will execute and return you to a new line if successful.

Taking Ownership over Files

Yet another issue with permissions stems from file access. I can’t imagine how many times I’ve been working on file changes but haven’t been able to apply them because of insufficient permissions. You’ll want to perform any ownership changes under root, if possible.

The command chown for Change Owner is fairly straightforward and works in most all Linux and Unix environments. For Ubuntu users you will need to run sudo before any chown commands, unless you happen to be logged in as root.

There are only two arguments required to execute successfully. First you’ll need to enter the user name which will be granted file ownership, followed by a space and the file path. The system will work out of your current working directory to choose the file, but if you’d like to bypass the hierarchy you can begin the path at root with a forward slash.
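A harmless way to try the chown syntax is to grant ownership to yourself, which needs no root at all (the file name below is a made-up example):

```shell
# chown takes a user name, then a file path; chown-ing a file to yourself
# is a safe, no-root way to rehearse the syntax
cd "$(mktemp -d)"
touch config.php
chown "$(id -un)" config.php   # id -un prints your own user name
ls -l config.php               # the third column shows the owner
```

On a real server you would substitute the web server’s user (and likely prefix sudo), but the argument order stays the same.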

The system of file ownership applies a lot more fruitfully in server maintenance. If you have shell access to a server, you will certainly need to understand file manipulation and taking over file permissions. For example, the installation of many common web scripts requires edits to the database configuration files. Setting the right ownership on these files helps keep you out of harm’s way should a hacker get into the server console.

Putting it all Together

Now, with all of these new commands, you should start experimenting in the console of your choice. A great place to start building your knowledge is with wildcards and selecting files within your OS. As a DOS and Linux user myself, I would suggest practicing with lighter commands at first, so as not to risk any damage to your files and directories.

Bad things can happen with the rm command and some faulty wildcard matches. If you’re planning on deleting anything try running your wildcard selectors under ls first. This will return the list of files you wish to delete, and if everything looks chummy you can always run the command right afterwards! In any terminal window simply press the up arrow key to bring back your last command input. Delete the ls and replace with rm then you’re good to go!
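The ls-before-rm trick above looks like this in practice (the banner file names are scratch examples):

```shell
# preview a destructive wildcard with ls before handing it to rm
cd "$(mktemp -d)"
touch banner1.jpg banner2.jpg keep.txt
ls banner*.jpg   # shows exactly what the selector matches
rm banner*.jpg   # same selector, now deleting for real
ls               # keep.txt survives
```

Because both commands share the identical pattern, whatever ls printed is precisely what rm will delete, and nothing more.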

There is a lot you can perform within the command line, but there are also many things you can’t. Keep humble with your usage and don’t go overboard just to become the technology king. You can certainly start using the CLI (Command Line Interface) for most of your everyday tasks. But quite honestly, there are many things you can do quicker from a GUI. If you research and play around with some commands, you’ll quickly pick up which tasks perform well in the terminal and which are best saved for a mouse and keyboard.

12 Shell Commands all Bloggers Should Know

1. Deleting Nested Folders

With the rm command you can remove (unlink) files and folders from your hard drive. But what about a whole lot of nested folders too? Especially if each folder set contains subsequent files and mismatched data. The option -r will recursively flip through all subsequent files and folders to remove the data and directories.

If you add in the -f option, it forces the removal without prompting you for any confirmation. There is no return output and it will bypass nonexistent files in all sub-directories. The whole command in action may look like this:

rm -r -f /home/you/documents/mydir1/2009

2. Connecting to a Database

When you are accessing a website backend system frequently, you’ll want to ensure a safe connection is created. This goes double for database connections where website and user information is stored. But if you’re working with a local database install, you can probably get away with far fewer security requirements.

Depending on the system you are using, there will be different syntax to adjust. The basic call to connect to a database is still generally the same. You will need the name of the database you’re accessing, your username, password, and possibly the database hostname (usually localhost). I’ve added two shell commands below, one to connect to MySQL and the other to Sybase.

mysql -u myusername -h localhost -p

Here you would simply hit enter with no password provided. Then if the shell command successfully accesses that database and host it’ll prompt for your password. Enter this on the new line and hit enter again. MySQL will welcome you upon success.

isql -U myusername -P <<EOF
use gdb_1
EOF

Sybase is another great example of database software. You can access these types of databases with the isql command similar to the mysql above. Here you are only providing a username and password, and then calling the use command to pick your database.

3. Backup a Database

Now that you’re connected into the database there are plenty of commands you could run. Ideally you’ll want to stick with simple SQL procedures and not go through adding new users or articles directly. But ever consider backing up your entire database structure? Well the commands are fairly complicated, but with 15-30 minutes of research you can probably figure them out.

Sybase is much more complicated and requires some heavy shell commands. If you check out Ed Barlow’s database backup scripts I’m positive you’ll be able to work with his packages no problem. He outlines some basic solutions to dump all database tables, dump error logs, database statistics, run logs, etc. It’s fairly robust and works well for almost anything you’d need.

MySQL backups work in a similar fashion and require a fairly long shell script. The contents involve choosing a local directory to save each backup and calling a for loop in Bash. This loops through every database and pulls out all tables as a .gz archive file using $MYSQLDUMP and $GZIP. The full code can be downloaded from nixCraft’s shell script article targeting MySQL dumps. Simply edit your database/login information and save it as mysqlbackup.sh somewhere on your hard drive. You can run this manually, or alternatively schedule a cron job for every day, week, month, etc.
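The shape of that loop is worth seeing on its own. In this sketch the database names and backup directory are made-up examples, and an echo stands in for the real mysqldump call (shown in the comment) so you can run it without a database server:

```shell
# shape of a per-database backup loop; echo is a stand-in for mysqldump
BACKUP_DIR="$(mktemp -d)"
for db in blog shop; do
    # real version: mysqldump -u user -p"$PASS" "$db" | gzip > "$BACKUP_DIR/$db.sql.gz"
    echo "-- dump of $db" | gzip > "$BACKUP_DIR/$db.sql.gz"
done
ls "$BACKUP_DIR"   # blog.sql.gz  shop.sql.gz
```

Swap the echo for the commented mysqldump line, point BACKUP_DIR somewhere permanent, and the loop becomes a cron-friendly backup script.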

4. Restore a Database

Now we come to restoring the backup of a database file. This isn’t as complicated as you might think, although from the looks of the previous code I can understand why. But consider that it’s a lot easier to upload previous files than to connect and pull down data from a remote server.

In Sybase you’ll be doing a lot more work in shell. But the basic command is load database dbname. You can follow this up with further options, and of course you’ll need to be connected into the database before this will work. If you’re stuck try using the Sybase documentation file as a reference point.

With MySQL you only need a single command if you’re already logged in. Or even if you aren’t you may connect and call the restore simultaneously. This is because the backup of any MySQL database file is basically SQL code which can reconstruct the database from scratch. This is the reason some backups are enormously large and oftentimes too big to upload via web interface like phpMyAdmin.

You can call the mysql command with a single line. As before you enter -u and -p but only fill in your username since your password is prompted afterwards. The code below should work perfectly:

mysql -u username -p database < /path/to/dump_file.sql

The only values you’ll want to replace are username, database, and your backup path. The username and database name are the same as before when you connected, so you’ll only need to find where your database backup is stored and point the command at it.

5. Direct Shell Downloads

The wget command is very interesting and offers a lot of options. GNU wget is a non-interactive utility to download files from the Internet. This includes standard HTTP, HTTPS, and FTP protocols in the mix.

To download a basic file you would type wget url, where url is the location of your file. This could be anything online, such as http://media02.hongkiat.com/v4s/n_logo.gif for the Hongkiat .gif logo file. If you create a shell script file holding many such calls, you can download large batches of videos, images, music, or other content in the background while you work. And keep in mind that for FTP URLs you can use wildcards such as * and ? to pull whole directories of files.

Now you may also wish to download contents via FTP. However much of the time you won’t be working with public ftp servers and will need a username/password. The login syntax is a bit confusing, but I’ve added a small example below.

wget ftp://username:password@ftp.mywebsite.com/files/folder/*.jpg

6. Compress Folders

We touched on compression a bit earlier, but merely in description. There are some very basic examples of file compression which you can call from the command line anywhere. I recommend using the zip command if you are new to the shell, only because the Linux archive ecosystem can get confusing. However, if you’d like to use gzip or another alternative, feel free.

Whenever you call a complete zip command you’ll want to include all the files within your new archive. The second parameter from a zip command is the folder you’d like, or alternatively a short list of files to zip. Adding the -r option recursively traverses your directory structure to include every file. Below is the perfect example of a small folder compression.

zip -r newfile_name.zip /path/to/content/folder

7. Mass Find and Replace

Whenever you have a large collection of files, you’ll often have them written to a similar pattern. For example, a large set of website pages may all contain the same ‘banner’ markup. The text inside all of these files can be mass replaced with the shell sed command.

sed is a stream editor which is used to perform basic text transformations and edits on files. It is known as a very efficient command, since it will sweep through a whole directory of files almost instantaneously. Below is some example code using the command.

sed -i 's/abc/xyz/g' *.html

The command above looks at every .html file in the current directory and replaces each occurrence of abc with xyz inside the file contents. Note that sed edits what is inside files, not the file names themselves; to rename a batch of files you would use a loop with mv, or the rename utility, instead. With the -i option we can edit files in place automatically with no backup requirements. Have a quick peek at the sed documentation for more info.
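Here is a sketch of sed rewriting text inside files, using scratch file names and contents as examples:

```shell
# sed replacing text across every .html file in a throwaway directory
cd "$(mktemp -d)"
printf 'abc banner\n' > page1.html
printf 'no match here\n' > page2.html
sed -i 's/abc/xyz/g' *.html   # in-place edit; files without a match are untouched
cat page1.html                # xyz banner
cat page2.html                # no match here
```

This assumes GNU sed; on BSD/Mac sed the in-place flag needs an explicit backup suffix, e.g. sed -i '' 's/abc/xyz/g' *.html.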

8. Create New Files

It can be pesky to create a whole heap of the same files in one sitting. If you would like to create a large set of documents or text files without using software, the command line is a great tool. Consider some of the editors at your disposal directly from shell.

vi/vim is possibly the best and most useful editor for the Linux CLI. There are others, such as the JOE text editor. You could also create an empty file with the touch command, or pipe text straight into a new file with cat > filename, though cat offers no way to edit the contents afterwards.

With vi you’ll only need to call a single line of code. I’ve added it below: simply the vi command followed by your new filename. Once you are in the vi editor, type ‘i’ to enter insert mode and add new text. To save and exit a file, press the esc key followed by :x and hit enter. It’s a strange combination at first, but once you get the hang of things you’ll never want to go back!

vi /home/you/myfile.doc

9. Shell Networking Tools

The shell command line offers quite a few tools for networking. The ping command can be used followed by an IP or web address to check the status of a website. A packet request is sent to the server and if you get a response back shell will output the time and server details. This can be useful to check if a website is down, or similarly if your Internet connection is down.

If you’d like to check your current settings, call the ifconfig command. This is very similar to the ipconfig command in Windows DOS, but shell ifconfig gives you a lot more options to inspect and configure your network interfaces. A very similar command, netstat, is just as useful for displaying your current open ports and connections.

10. Package Management

When installing software via the shell, you’ll mainly be working with one of two packaging systems. RPM Package Manager (RPM) and the Debian package format (DEB) are the most widely known. Their repositories are kept up to date with the latest packages, which you can download from the closest mirror site.

The commands are very similar on either system. yum and rpm are the two commands reserved for the former package manager. Their syntax follows yum command package-name. So for example:

yum install package-name

For Debian/Ubuntu users, you’ll be using the Debian package tools. Again the syntax follows a similar format: call the package manager, the command, and follow it up with a package name where needed. The two examples below are formatted for an install of a single package and an upgrade of all installed packages, respectively.

apt-get install package-name
apt-get upgrade

11. Generate List of Largest Files

Organization is what keeps you efficient through long work sessions. When you start to lose track of files and notice your directories getting too large, it’s time for some spring cleaning. The ls command is very useful here, as it gives you a greater perspective into some of your directories.

This includes sorting specific types of files and file formats. If you’d like to find the biggest files in any directory on your HDD simply apply the command below.

ls -lSrh

There are 4 separate options attached to this command. -l is used to list full output data. -S will sort the entire list by file size, initially from largest to smallest. By applying -r we then reverse the sort order so the largest files in your output will end up at the bottom. This is good since the shell window will leave you at the very bottom of your output commands anyways, so it’s easier to clear through the list. -h simply stands for human readable output data so you’ll see file size in megabytes(MB) instead of bytes.
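The sort order is easy to verify with two scratch files of known sizes (the names and sizes below are arbitrary examples):

```shell
# ls -lSrh in action: the biggest file prints last, nearest your prompt
cd "$(mktemp -d)"
head -c 2048 /dev/zero > big.bin    # a 2 KB file
head -c 10   /dev/zero > small.bin  # a 10-byte file
ls -lSrh                            # small.bin first, big.bin last
du -sh ./* | sort -h                # an alternative per-entry size report
```

The du pipeline at the end is a handy companion: unlike ls it also totals up directory contents, which is usually what you want when hunting for disk hogs.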

12. Create an E-mail On-The-Fly

If you use desktop software for your e-mail accounts, this command will save you loads of time. Often you know the e-mail address of the person you’re looking to reach but don’t want to spend time opening your mail client. A mailto: link passed to your system opener (xdg-open on Linux, open on Mac OS X) works exactly the same from the command line as it does from any browser or website.

Even if you don’t know the address you’re looking to send to, just add in anything; noreply@nothing.com works great! Or be creative with your own filler content. Either way, after you hit enter a brand new e-mail message window pops open with your recipient address filled in. You can modify the subject, body and CCs to your own needs in a quick instant.

xdg-open mailto:noreply@hongkiat.com

External Resources

To continue down the road of shell scripting requires a lot of patience and dedication. There are hundreds of commands to understand and so many sub-categories to dig into. Spend some time messing around in a console window and see how you like the speedy workflow. Hopefully the resources mentioned throughout can provide more information to keep you going with shell scripting on Linux and Mac OS X.

Conclusion

Having spent time on all 3 of the major Operating Systems, I have to say they are all fantastic in their own regard. But if you are working off a Linux OS, the terminal becomes just as important as any GUI could be. I feel it is very important to recognize even the most basic commands and to practice working within the command line interface.

Those of you new to the system will surely run into roadblocks. This is a normal part of the learning process, but keep your wits and never give up! Build yourself up to expect solid, realistic goals. Learning shell scripting will take a lot of work initially. But within a week or two you should have mastered some of the basics and begin using the terminal flawlessly (well, mostly). If you have secrets or awesome tips for working within the Linux/Unix CLI please share them in the discussions area below.

Related posts:

  1. 20+ Essential Tools and Applications For Bloggers
  2. Basic Guidelines to Product Sketching
  3. Networking Guide for Bloggers: Why It Is Important (Part 1)
  4. Networking Guide for Bloggers: Making a Good First Impression (Part 5)

The World of Lost Smartphones [Infographic]

Posted: 12 Apr 2012 11:36 PM PDT

If you have ever lost your smartphone then you know how much of a hassle things can become for the next few hours – passwords need to be changed, all your contacts need to be informed and their contact details retrieved one way or another, and you can say bye bye to your music collection or photo albums. All that information could be in a Lost & Found section somewhere in town, or worse yet in the hands of a stranger who knows how to break into your phone and use the information he can find inside. If you don’t have a security tool installed to protect the precious data inside your phone, you might be beating yourself up for your loss.

Well, this might give you some form of comfort: losing a smartphone happens a lot more often than you may think. Rather than just give you the figures, we have for you an infographic released by backgroundcheck.com. Based on information provided by the mobile security company Lookout, this visual depiction of The World of Lost Smartphones shows you where, how often, and in what numbers people have been losing their smartphones all over the U.S. Check out the numbers; they might just give you a scare.

[Infographic: The World of Lost Smartphones]

Spot an infographic you think will be a perfect fit here? Send the link to us with relevant details and we’ll credit you with the find.

Related posts:

  1. Evolution of Apple Products (2001-2011) [Infographic]
  2. 16 Ways How Educators Use Pinterest [Infographic]
  3. Wikipedia: Redefining Research [Infographic]
  4. What Happens in an Internet Minute [Infographic]
