Sunday, March 6, 2011

Turn an online webcam into a video log

Opponents of the transport of huge "megaloads" of oil-refining equipment through the wild mountains of Idaho and Montana need a way to monitor the progress (or not!) of the loads.  The Idaho and Montana Departments of Transportation operate webcams around their states, with images available on their websites.  Volunteers have been staying up all night watching these webcams to keep tabs on (and potentially capture images of) the megaload convoys.

I wanted to relieve the night watchmen with an automated capture.  The method I used is outlined below.

Note:  This method is Macintosh-specific, though the principles could be applied in Linux or Windows.  If you craft a solution for other platforms, please include it in a comment on this post.

My plan was to capture the image (or the whole web page) into a file four times per hour, since the web-available images are updated every fifteen minutes.  I figured I needed a capture program, and a way to call it at intervals.


Capture the Page

The first step was easy: Googling "mac webcam capture" led me to Paul Hammond's excellent script, "webkit2png".  The script requires the "python" programming language and the "PyObjC bridge" software, both included in Mac OS X 10.5 and later, but you don't have to know a thing about python or PyObjC to use webkit2png.  When run from the command line, it accepts a URL, fetches the web page, and converts the entire page to a "png" image file, a format widely supported by browsers and other graphics programs.

To run this in an automated way, each file generated needs a unique file name.  webkit2png offers an option to append the date to the output filename.  Looking at the script, it wasn't hard to find the function that fetches the date.  I googled that python function and found out how to modify it to include the time as well as the date.  With that one six-character mod, webkit2png was good to go for me.  [Note that this was simplified for me by the fact that webkit2png is a script, not a compiled program.  It often takes less knowledge to modify a script than a compiled program.]
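
I don't remember the exact lines in webkit2png, so take this as a sketch of the kind of change rather than a quote of the script; it assumes the date stamp is built with Python's standard datetime module:

    import datetime

    # before: date only, e.g. "20110306"
    stamp = datetime.datetime.now().strftime("%Y%m%d")

    # after: date plus time, e.g. "20110306230800" -- appending "%H%M%S"
    # is a six-character change of exactly this flavor
    stamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")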

The command I ended up with is:
python ~/bin/paulhammond-webkit2png-9c4265a/webkit2png -Fd -o Lolo -D ~/Desktop/+Lolo http://rwis.mdt.mt.gov/scanweb/lolo.shtml 
  • python is the program which will "interpret" (run) the script
  • ~/bin/....webkit2png is the script to run
  • -Fd tells webkit2png to append the date-time to the output filename
  • -o Lolo gives the "stem" (first part) of the output filename
  • -D ~/Desktop/+Lolo is the directory into which to put the output file
  • http://...lolo.shtml is the URL to capture.

Launch At Intervals

The classic way to launch a command at certain times in Unix-derived systems like Mac OS X is to use the built-in program cron.  [Though Apple now uses and recommends their "launchctl" program, citing extended features, I'm an old Unix hand, and I don't need extended features, so I went with what I know.]  cron "wakes up" once per minute, and reads a control file (called crontab) that tells it what commands to run at specific times.    

crontab has a somewhat arcane syntax, so rather than relearn it, I downloaded Cronnix, which provides a convenient dialog box (in a "simple" or "advanced" mode) to create the entries.  In Cronnix I use:
  • min: 8,23,38,53 - one minute after the image is updated by the webcam
  • hour: 0-6, 18-23 - only record from midnight to 6am and 6pm to midnight (the megaloads only roll at night)
  • mday:  * - any day of the month
  • month: * - any month
  • wday: * - any day of the week
  • Command: - as above
I clicked Cronnix's "Save" button, and that was it!
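
For reference, the crontab entry Cronnix saves comes out roughly like this (fields in order: minute, hour, day of month, month, day of week, command):

    8,23,38,53 0-6,18-23 * * * python ~/bin/paulhammond-webkit2png-9c4265a/webkit2png -Fd -o Lolo -D ~/Desktop/+Lolo http://rwis.mdt.mt.gov/scanweb/lolo.shtml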

Notes
  1. The particular webcam I am logging uses a static URL for the actual image, so I could have fed the img "src=" URL to webkit2png instead of the web page URL (a bare-bones sketch of that approach follows these notes).  In fact, any URL that returns a file type WebKit can interpret could be used.
  2. There are many other ways to accomplish the same thing.  This one was pretty simple for me to set up.  YMMV.
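
As a sketch of that first note: if all you want is the image itself, the Python that ships with Mac OS X will fetch and timestamp it without WebKit at all.  The image URL below is a placeholder, not the camera's real address:

    import datetime
    import os
    import urllib

    url = "http://example.com/webcams/lolo.jpg"      # placeholder image URL
    stamp = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
    dest = os.path.expanduser("~/Desktop/+Lolo/Lolo-%s.jpg" % stamp)
    urllib.urlretrieve(url, dest)

The same cron entry shown above would drive this just as well.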



Monday, October 11, 2010

Emails with attachments don't appear in document libraries

Emails to a SharePoint 2007 document library have some "gotchas" when it comes to attachments. Filenames and encoding can both cause silent failures; the email appears to be sent correctly, but the doclib is unchanged, and no error appears to the user, either via return email or on the website.
Filenames may contain characters that are valid in your desktop OS, but are not valid for SharePoint. Microsoft lists the bad guys in this Knowledge Base article:
Information about the characters that you cannot use in sites, folders, and files in SharePoint
I follow this simple rule: don't use punctuation other than underscores in filenames, and don't put an underscore first.
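If you want to enforce that rule automatically before mailing a file in, here's a rough sketch in Python (my own convention, nothing SharePoint-specific):

    import re

    def sharepoint_safe(filename):
        # Keep letters, digits, and underscores in the name; replace other
        # punctuation with underscores; never start with an underscore.
        stem, dot, ext = filename.rpartition(".")
        if not dot:                      # no extension at all
            stem, ext = filename, ""
        stem = re.sub(r"[^A-Za-z0-9_]", "_", stem).lstrip("_")
        return stem + dot + ext

    sharepoint_safe("budget (final).xlsx")     # -> budget__final_.xlsx
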
Encoding of attachments may also be an issue. Testing on our SharePoint 2007 site indicates that attachments in the AppleDouble format (often the default for email clients on the Macintosh) cause the email to fail silently. Be sure to "encode for Windows" if sending from a Mac email client. [Note that GMail uses MIME/Base-64 encoding, which SharePoint accepts.]

Wednesday, July 21, 2010

SharePoint Overview

Posted to sharepointdiscussions@groups.yahoo.com in response to a thread about SharePoint vs. Google Apps for a small church organization:

 I think about SharePoint in the following terms (YMMV):

SharePoint at its core is web-based file sharing.  Conventional LAN file shares have some major drawbacks:  you have to be on the LAN to access them, and it's difficult to include and utilize metadata.  SharePoint began by addressing these two issues.  [Today's SharePoint also includes document management features like global search and workflow.]

If you ask, "Why do people share files?", the short answer is, "They're collaborating".  So SharePoint includes collaboration features like check-out/check-in, announcements, calendars, and discussions.  This is the basis of the "teamsites".

It is common in today's project-oriented workplace for people to work on several project teams.  Some integrated view of multiple teamsites is needed, so SharePoint includes portal features like MySite.  [Today's SharePoint also includes "social" features like "Colleague Tracker".]

Now that SharePoint is collecting all those eyeballs, it becomes attractive as a business application platform.  Data views, Business Data Connector, Key Performance Indicators, and so on follow.

With so much web-based business function co-located on the SharePoint platform, it is also attractive to integrate content management;  hence the Publishing feature.

So you can consider SharePoint an integrated suite of web-based applications for file sharing, collaboration, business information, and publishing.  If all you need is file-sharing, there are simpler alternatives.

Alex is exactly right:  details are king.  The problem is that the details of the work that people need to do are often hidden from the person who makes the decisions about what platform they will use.

-- Joshua


Thursday, April 29, 2010

Controlling access to content for anonymous users?

sharepointdiscussions : Message: RE: [sharepointdiscussions] How do you control access to content for anonymous users?

This post discusses a simple way to deny access for anonymous users to particular pages within a public publishing site.  [Sorry, Yahoo registration required.  I thought it was public :-( ]

Wednesday, April 14, 2010

Moving A Site - "Export ran out of memory"

Another hitch in the site migration saga (see my earlier post).  Some sites I am migrating reach the "Compressing File(s)" stage, then error out:

FatalError: Export ran out of memory while compressing a very large file. To successfully export, turn compression off by specifying the -nofilecompression parameter.
So the really big .cmp/.cab files I was creating can't be compressed.  The solution suggested in the error message is not attractive, since these are files that have to be transmitted over the network.  My solution:  cut the "-cabsize" parameter on the "stsadm -o export" command back to "60". The export ran to completion,  the SharePoint Designer "Restore website..." operation was successful, and the site actually looks correct.
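
For reference, the export command ends up looking something like this (the site URL and filename here are placeholders):

    stsadm -o export -url http://myserver/sites/mysite -filename mysite.cmp -cabsize 60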

The one gotcha I saw had to do with the "60" (megabyte) limit on the size of the .cmp/.cab files.  This particular site was exported as 6 files, with sizes (in order of creation) of 89, 86, 82, 64, 68, and 56 megabytes.  Examining the contents of the 89MB .cmp file (by simply changing the file extension to .cab and double-clicking) showed that the last file added to the .cab was a 47MB .mp3.  The export process probably doesn't look ahead at file sizes to decide when to start the next .cab file, but waits until it detects the overflow.

So admins migrating SharePoint sites with (even moderately) big files, beware!

Monday, March 22, 2010

Moving A Site - "Restore Did Not Complete Successfully"

I have been migrating sites from one SharePoint farm to another by means of making a "backup" or "export" from the source server and doing a "restore" or "import" on the destination server.  This is not as simple as it seems, since Microsoft is determined to make it confusing (which is not far from making it just plain difficult).

Friday, February 19, 2010

Using [Today] or [Me] in SharePoint calculated columns

One would like to be able to use dynamic information like [Today] or [Me] in calculated columns (for example, to compute "Age" from "DateOfBirth"), but SharePoint complains.

Steve Eagleson was kind enough to point out this article from "Novotronix":

Novotronix Blog - Using [Today] or [Me] in SharePoint calculated columns

which contains a workaround which seems to... um, work.

The workaround involves creating a temporary column called "Today", then creating the formula, then deleting the temp column. This suggests that SharePoint actually knows how to use the dynamic variable, but the parser rejects it when the formula is saved. Very odd...
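
As a concrete sketch of the sequence (using the Age example above; treat the exact formula as illustrative rather than gospel):
  1. Add a temporary column named "Today" (any type will do) to the list.
  2. Create the calculated column with a formula along the lines of =DATEDIF([DateOfBirth],[Today],"Y").
  3. Delete the temporary "Today" column.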