Linux / OS X Newbie Tips – Tea-Driven Development
Matt Wynne – taking it one tea at a time
https://blog.mattwynne.net

Pain-free and fun password-less SSH with ssh-forever
Sun, 16 Aug 2009
https://blog.mattwynne.net/2009/08/16/pain-free-and-fun-password-less-ssh-with-ssh-forever/

A while ago I wrote a tip that showed you how to copy your public SSH key to a remote server to allow you to login without entering a password every time. I’ve since put this into a script that I use, and today I got sick enough of copying that script onto yet another machine that I packaged it up as a gem, so now we can all use it.

It works just like the plain old ‘ssh’ command, but this time you’ll never have to enter your password again:

ssh-forever username@yourserver.com

Your key will be generated (if necessary), copied to your server, and you’ll be logged in as normal.

Installation

gem sources --add http://gemcutter.org
gem install ssh-forever

Example:

[matt@bowie ssh-forever (master)]$ ssh-forever mattwynne@mattwynne.net
You do not appear to have a public key. I expected to find one at /Users/matt/.ssh/id_rsa.pub
Would you like me to generate one? [Y/n]y
Copying your public key to the remote server. Prepare to enter your password for the last time.
mattwynne@mattwynne.net's password:
Success. From now on you can just use plain old 'ssh'. Logging you in...
Linux broncos 2.6.29-xeon-aufs2.29-ipv6-qos-grsec #1 SMP Thu Jul 9 16:42:58 PDT 2009 x86_64
  _
 | |__ _ _ ___ _ _  __ ___ ___
 | '_ \ '_/ _ \ ' \/ _/ _ (_-<
 |_.__/_| \___/_||_\__\___/__/

 Welcome to broncos.dreamhost.com

Any malicious and/or unauthorized activity is strictly forbidden.
All activity may be logged by DreamHost Web Hosting.

Last login: Sat Aug 15 17:24:17 2009
[broncos]$

Why?

Because I can never remember how to do it by hand. Now I don’t have to, and nor do you.

Quick and Easy Password-less SSH Login on Remote Servers
Tue, 13 Jan 2009
https://blog.mattwynne.net/2009/01/13/quick-and-easy-password-less-ssh-login-on-remote-servers/

Like so many posts in this category, this is surely child's play to you Linux aficionados. For us mere mortals, though, it's a very useful little trick, and it shows how easily you can move data from your local workstation to a remote server over SSH.

If you don’t already have a public / private key pair on your local workstation do this:

ssh-keygen -t rsa

If you have no idea what I’m talking about, try looking for this:

ls ~/.ssh

Did you see anything? You're looking for a file called id_rsa.pub. If it's already there, you have a key pair and can skip the generation step.
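By default ssh-keygen prompts you for a file location and a passphrase. If you'd rather script it, a minimal non-interactive sketch, writing to a throwaway directory rather than your real ~/.ssh (the empty passphrase is precisely what makes password-less login possible):

```shell
# generate a throwaway RSA key pair without any prompts
d=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$d/id_rsa"   # -N '' = empty passphrase, -q = quiet
ls "$d"   # id_rsa (the private key) and id_rsa.pub (the public key)
```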

Now that you have generated your key, to copy the public key part up to the remote server, do this:

ssh remote-user@remote-server.com "echo '$(cat ~/.ssh/id_rsa.pub)' >> ~/.ssh/authorized_keys"
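Whatever form the quoting takes, the effect on the server is simple: your public key gets appended to ~/.ssh/authorized_keys. Here's that remote side simulated locally, with a temp directory standing in for the remote home and a fake key string standing in for your real one:

```shell
# simulate what the server ends up doing when the key arrives
remote_home=$(mktemp -d)                     # stands in for the remote ~/
pubkey='ssh-rsa AAAAB3NzaC1fake matt@bowie'  # stand-in for your real id_rsa.pub

mkdir -p "$remote_home/.ssh"
echo "$pubkey" >> "$remote_home/.ssh/authorized_keys"

# sshd quietly ignores authorized_keys unless the permissions are tight
chmod 700 "$remote_home/.ssh"
chmod 600 "$remote_home/.ssh/authorized_keys"

cat "$remote_home/.ssh/authorized_keys"
```

If logins still prompt for a password afterwards, loose permissions on the remote ~/.ssh are the usual culprit; the one-liner above doesn't fix them for you.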

You should now be able to dance around the inner bits of the internet to your heart’s content.

Thanks to Dan Lucraft for the technology behind this post.

Ubuntu Eee – The OS Your EEE Should Have Been Born With
Mon, 27 Oct 2008
https://blog.mattwynne.net/2008/10/27/ubuntu-eee-the-os-your-eee-should-have-been-born-with/

On finishing a long contract and project at the BBC a few months ago, I was incredibly touched to be given a brand new Asus EEE PC as a leaving gift by my colleagues.

Although I love the tiny form factor and take it with me practically everywhere, I was never quite satisfied with the default Xandros Linux, and have fidgeted around ever since trying out different options, spending way too much time on the excellent eeeuser.com community site, zapping the flash drive with different distros.

Finally this evening I think I found the answer: http://www.ubuntu-eee.com/

Slick, easy to install, great looking, and of course a proper operating system under the hood. Props to the team who put this together, it’s terrific. If you have one of these little beauties yourself, I highly recommend checking it out.

Danger: MacPorts Breaks Your Rails/Ruby Relationship
Thu, 31 Jul 2008
https://blog.mattwynne.net/2008/07/31/danger-macports-breaks-your-railsruby-relationship/

So I just spent a fun morning trying to work out why so many of our unit tests were breaking on my machine with the message:

undefined method `[]' for #<Enumerable:

It turns out that Ruby 1.8.7 doesn't run Rails very well, and if you take the time to read the small print it's clear you really need to just run 1.8.6.

Trouble is, the wonderful MacPorts is now installing 1.8.7.

I have been fumbling around for the past few days installing and building various versions of things like memcached to get our app to work on my OS X 10.4 machine. It's been a lonely furrow to plough – all the other devs in the office are hard-core emacs / ubuntu heads, and it seems like most Mac people have moved on to Leopard these days…

At some point I must have made the mistake of running

# DO NOT RUN THIS!
sudo port upgrade ruby

So be warned, people.

Fetch and Parse HTML Web Page Content From Bash. Wow.
Sat, 26 Apr 2008
https://blog.mattwynne.net/2008/04/26/fetch-and-parse-html-web-page-content-from-bash-wow/

Okay, this is another one of those linux newbie posts where I tried to figure out how to do something that’s probably really obvious to all you seasoned hackers out there.

Anyway here I go clogging up the internet with a post that somebody, somewhere will hopefully find useful.

Are you that person? Well… have you ever used the shell command curl to fetch a web page? It's cool, isn't it? But you do end up with a splurge of ugly HTML tags in your terminal.

Eugh!

So… how about we parse that HTML into something human-readable?


Enter my new friend, w3m, the command-shell web browser!

If you’re using OS X, you can install w3m using darwinports thusly:

sudo port install w3m

Linux hackers, I’m going to assume you can figure this out for yourselves.
So, with a brand-new blade in our swiss-army knife, let’s pipe the curl command into the standard input for w3m and see what happens:

Hmm… two problems here: because I've grabbed its output and piped it off to w3m, curl has started blethering on about how long it took. I can fix that with a swift but ruthless flick of the -s switch to silence it. How about all that raw HTML though? I thought this w3m thing was supposed to parse my HTML, not just regurgitate it!

It turns out that w3m assumes its input is of MIME-type text/plain, unless told otherwise. Let’s set the record straight:
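The screenshots from the original post are gone, so here's the pipeline reconstructed as a runnable sketch. A canned HTML snippet stands in for curl's output so it works offline; against a live page you'd pipe curl -s into the same w3m invocation:

```shell
# -T text/html tells w3m to parse its input as HTML rather than showing it
# verbatim; -dump renders straight to stdout instead of opening the pager
command -v w3m >/dev/null || { echo "w3m not installed"; exit 0; }
printf '<html><body><h1>Hello</h1><p>Parsed, not regurgitated.</p></body></html>' \
  | w3m -T text/html -dump

# against a live page, the same idea:
#   curl -s http://example.com/ | w3m -T text/html -dump
```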

Aw yeah. Now we’re talking. Old-skool green-screen meets nu-school interweb. It’s like being back on the BBS network of yore.

What’s the point of all this? Well, that’s up to you. I have a couple of ideas, but you’re going to have to start coming up with your own you know. Why are you reading this anyway? Haven’t you got anything better to do?

Saving Your WordPress Blog to CD
Fri, 11 Apr 2008
https://blog.mattwynne.net/2008/04/11/saving-your-wordpress-blog-to-cd/

So the wife has been writing her mandatory university course diary as a WordPress blog, but now she needs to hand it in.

> Can you put it on a CD for me?

she asks.

Unix to the rescue!

Following this excellent article I had the site saved down to disk in a jiffy, with all links modified to work offline, all images and CSS files copied down.

For your reference, here’s the command I used.

    wget --mirror -w 2 -p --html-extension --convert-links -P ~/path/to/save/locally -H -Dwordpress.com http://yourblog.wordpress.com

Quoting Jim’s article for the meaning of the command line options:
> --mirror: specifies to mirror the site. Wget will recursively follow all links on the site and download all necessary files. It will also only get files that have changed since the last mirror, which is handy in that it saves download time.
>
> -w: tells wget to "wait" or pause between requests, in this case for 2 seconds. This is not necessary, but is the considerate thing to do. It reduces the frequency of requests to the server, thus keeping the load down. If you are in a hurry to get the mirror done, you may eliminate this option.
>
> -p: causes wget to get all required elements for the page to load correctly. Apparently, the mirror option does not always guarantee that all images and peripheral files will be downloaded, so I add this for good measure.
>
> --html-extension: All files with a non-html extension will be converted to have an html extension. This will convert any cgi or asp generated files to html extensions for consistency.
>
> --convert-links: all links are converted so they will work when you browse locally. Otherwise, relative (or absolute) links would not necessarily load the right pages, and style sheets could break as well.
>
> -P (prefix folder): the resulting tree will be placed in this folder. This is handy for keeping different copies of the same site, or keeping a "browsable" copy separate from a mirrored copy.

I’ve also added my own at the end of Jim’s version:

-H -Dwordpress.com

These options tell wget to recursively fetch any file within the .wordpress.com domain – otherwise the stylesheets and images for the blog, which are stored in different subdomains of wordpress.com, will not be downloaded.
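One step the post leaves out: actually getting the mirror onto a CD. A common route is to roll the folder into an ISO image first and burn that. This is my addition rather than Jim's, and it assumes genisoimage (also known by its older name mkisofs) is available; a demo folder stands in for the real wget output:

```shell
# a demo folder standing in for the wget mirror output
mirror=$(mktemp -d)
echo '<html><body>demo page</body></html>' > "$mirror/index.html"

command -v genisoimage >/dev/null || { echo "genisoimage not installed"; exit 0; }
# -r adds Rock Ridge attributes (Unix-friendly names), -J adds Joliet (Windows names)
genisoimage -quiet -r -J -o blog.iso "$mirror"
ls -l blog.iso
```

The resulting blog.iso can then be burned with your desktop's CD-burning tool, or with wodim from the same package family.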

Use Rsync to Copy Your ASP.NET Website
Sun, 23 Mar 2008
https://blog.mattwynne.net/2008/03/23/use-rsync-to-copy-your-aspnet-website/

If you've ever tried to copy the source files from a Visual Studio 2005 ASP.NET solution, especially if you're using TFS and ReSharper, you'll probably have noticed the great steaming heaps of fluff and nonsense these tools leave all over your hard drive. Not to mention all the built assemblies lurking in your bin/Debug folders.

If you have a unix/linux/apple machine handy, or have at least had the sense to install cygwin or coLinux on your quaint old PC, then give this a rumble.

Prepare a file with the inclusion / exclusion rules for the files / folders you want to copy. For my solution, it looks like this:

+ *.cs
+ **/lib/.
+ *.csproj
+ *.sln
+ *.resx
+ *.jpg
+ *.gif
+ *.aspx
+ *.ashx
+ *.js
+ *.ascx
+ *.config
+ *.xml
+ *.txt
+ *.html
+ *.htm
+ *.dbp
+ *.sql
+ */
- *

There are a few fascinating points to note about this file:

  • The lines that begin with a ‘+’ character are include rules.
  • The second line copies anything in a /lib folder, which is where we keep dependencies in my world
  • The second-to-last line tells rsync to copy every folder. Without this, the last line would exclude all the folders (rsync works its way through the folder tree recursively) and you’d never get to the files you’d mentioned above. Read that bit again, it’s hard to understand at first, but important.
  • The last line excludes everything else.

Now fire up your terminal, and invoke the magic of rsync:

rsync -av --exclude-from=files.txt source/Projects/Widgets/ target/Projects/Widgets/

Which means

  • -a: archive mode, which copies recursively and preserves permissions, times and symlinks
  • -v: verbose output, so you can see which files were copied
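Before pointing rules like these at a real solution, it's worth sanity-checking them against a toy tree. A minimal sketch with made-up file names and a cut-down version of the rules file:

```shell
command -v rsync >/dev/null || { echo "rsync not installed"; exit 0; }

# toy source tree: a source file, a project file, and a built assembly
src=$(mktemp -d); dst=$(mktemp -d); filter=$(mktemp)
mkdir -p "$src/bin/Debug"
echo 'class Widget {}' > "$src/Widget.cs"
touch "$src/Widgets.csproj" "$src/bin/Debug/Widgets.dll"

# cut-down rules: keep sources and project files, walk every folder, drop the rest
cat > "$filter" <<'EOF'
+ *.cs
+ *.csproj
+ */
- *
EOF

rsync -av --exclude-from="$filter" "$src/" "$dst/"
ls -R "$dst"   # Widget.cs and Widgets.csproj arrive; the .dll does not
```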

And there you have it. Yet more unix joy.
