Thursday, December 29, 2011

Shrinking Transaction Logs in SQL Server 2005/2008 Using T-SQL

A common tripping point of database development in SQL Server 2005 and 2008 is management of the transaction logs.  In a fairly complicated transactional database design, developers often don't worry much about the transaction logs -- that is, until the log grows to the point where it hits its disk space limit (if it has one) or, worse yet, consumes most of the available physical disk space, not to mention saddles your database with an enormous performance hindrance.


These tasks ought to be taken care of by a DBA (routine transaction log backups, truncations, and so on), but developers may not have the luxury of DBA assistance in their test environments.

You can, of course, use SQL Server Management Studio to assist with shrinking the transaction log to free up disk space, but there may be times when you need to shrink it as part of your T-SQL stored procedure processes.  Sample code follows:


CREATE PROCEDURE [dbo].[usp_ShrinkTransactionLog]
AS

-- Flush dirty pages to disk so inactive portions of the log can be reused
CHECKPOINT

-- Optional: report the oldest open transaction still active in the database
DBCC OPENTRAN(SampleDatabase)

-- Grab the internal file id of the log file
-- (sysfiles is the legacy compatibility view; sys.database_files is the 2005+ equivalent)
DECLARE @wk_fileid INT
SELECT @wk_fileid = fileid
FROM sysfiles
WHERE [name] = 'SampleDatabase_log'

-- Shrink the log file toward its configured default size
DBCC SHRINKFILE (@wk_fileid)

GO


The stored procedure above calls DBCC OPENTRAN, which reports the oldest open transaction in the database; the call is optional -- more of a nice-to-have to see what's still ongoing in the database before you shrink.


The real meat of the stored procedure is grabbing the internal file id of your database's log and then issuing a DBCC SHRINKFILE against it, essentially performing the same task as Tasks --> Shrink --> Files in SSMS.  Since the code above does not specify a target size, SQL Server will try to shrink the transaction log to its configured default size.
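If you want to shrink toward a specific size instead, DBCC SHRINKFILE takes an optional target size in megabytes; for example (the 10 MB target is just an illustration):

-- Shrink the log file toward a 10 MB target (the target size is in MB)
DBCC SHRINKFILE (@wk_fileid, 10)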


Note that, if your database is on the Full recovery model, this sproc may not free up all the disk space that you'd want (with DBCC SHRINKFILE potentially coming back with a message saying it couldn't free up all the space requested).  This is because the Full recovery model, by design, expects regular backups of your logs; log records are kept in the virtual log files until they are backed up.  The act of backing up your transaction log truncates it, freeing up even more space.  So, we can modify the above stored procedure to take this into account:


CREATE PROCEDURE [dbo].[usp_BackupAndShrinkTransactionLog]
AS

CHECKPOINT

-- Optional: report any open transactions
DBCC OPENTRAN(SampleDatabase)

-- Back up (and thereby truncate) the log.  BACKUP LOG requires a destination;
-- the file path below is a placeholder -- adjust it for your environment
BACKUP LOG SampleDatabase TO DISK = 'C:\Backups\SampleDatabase_log.trn'

-- Grab the log file's internal file id and shrink it
DECLARE @wk_fileid INT
SELECT @wk_fileid = fileid
FROM sysfiles
WHERE [name] = 'SampleDatabase_log'

DBCC SHRINKFILE (@wk_fileid)

GO


Note that BACKUP LOG requires a destination, so the stored procedure above uses a placeholder file path; adjust it (or have your DBA point you at the right location) for your environment.  You can also dump the transaction log to a specific named Backup Device by swapping the TO clause: BACKUP LOG SampleDatabase TO SampleBackupDevice_Log1.
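If no such Backup Device exists yet, one way to create one (a sketch -- the device name matches the example above, and the file path is a placeholder) is via sp_addumpdevice:

-- Register a named backup device pointing at a file on disk
EXEC sp_addumpdevice 'disk', 'SampleBackupDevice_Log1', 'C:\Backups\SampleDatabase_log.bak'

-- Then back up the log to it
BACKUP LOG SampleDatabase TO SampleBackupDevice_Log1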


Using either of these stored procedures as part of your T-SQL routines will help keep your transaction log sizes manageable.  Of course, do consult with your DBAs if you end up moving your test database to a production environment, as I'm sure they will wonder why the log backups happen more frequently than they designed.  :-)


Additional info from MSDN: Shrinking the Transaction Log

Thursday, September 22, 2011

Using Synergy Over a Non-Split Tunnel VPN

Like any good geek, I have multiple computers running at home. Going by the "DRY" (Don't Repeat Yourself) principle of software development, I preferred not to use multiple sets of keyboards and mice to control these machines. Enter Synergy.

Synergy is a clever piece of open source software. It uses the basic client-server paradigm to allow you to share one computer's keyboard and mouse across multiple computers over the network. The idea here is that the "server" computer has the keyboard and mouse physically connected to it, and the "client" machines simply connect to the "server" to get access to its keyboard and mouse. Simple, yet elegant.

I wanted to take this concept a step further. For my work machine, we have a non-split tunnel VPN. In lay terms, this means that when I initiate a VPN connection to my work network, I lose all local connectivity on my laptop. In other words, my work laptop is no longer considered "local" to the other computers on my LAN. This is a bummer, because it means I can no longer share my one set of keyboard/mouse between my personal laptop and my work laptop.

Through some port forwarding trickery, I was able to get Synergy to run on my personal laptop as well as on my work laptop whilst I was on VPN. How did I achieve this?

  1. Establish a port forwarding rule on my router to my personal machine for HTTP (TCP port 80). In other words, if a machine outside my network hits my router's WAN address over HTTP, the router redirects that traffic to my personal machine.
  2. Configure the Synergy "server" on my personal machine to run on port 80.
  3. (Optional) If you have IIS running, set your Default Website to run on another port (say, 81) or just stop it outright.
  4. On the client machine (my work laptop while on VPN), set the host name to the WAN address of my router. Go to Advanced Options and set the port to 80.
  5. Start the Synergy server on my personal laptop. Start the Synergy client on my work laptop while on VPN. Presto. (A rough sketch of the equivalent configuration appears just below.)
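For reference, here's roughly what that setup looks like driven from the command line with a minimal synergy.conf -- the screen names and WAN address are placeholders, and on Windows the same settings live in the Synergy GUI:

# synergy.conf on the "server" (personal laptop); screen names are hypothetical
section: screens
    personal-laptop:
    work-laptop:
end

section: links
    personal-laptop:
        right = work-laptop
    work-laptop:
        left = personal-laptop
end

# Start the server listening on port 80 instead of the default 24800:
#   synergys --config synergy.conf --address :80
# Start the client on the work laptop, pointing at the router's WAN address:
#   synergyc my-router-wan-address:80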
So, let me explain my approach above. By default, Synergy runs on TCP port 24800, which is all well and good for my local network (I can do whatever I please with regards to my router's firewall, port forwarding, etc.). However, that is not kosher for my work's firewall. In fact, my work's firewall blocks all outgoing traffic to "non-common" ports... we're extra stingy at my work, so the only "common" port defined is HTTP (port 80), since, well, that's kind of the backbone of the Internet, and they don't want to block all Internet traffic.

TCP Port 80 is the only non-blocked TCP port I could use to connect Synergy from my work network (via non-split tunnel VPN) to my personal network, hence the setup above.

Of course, this little setup only works if it's not vital for you to actually publish web content on Port 80 for your local network...it personally isn't for me (that's what my web hosts are for!). If your work network's firewall rules are less stingy than mine, you can of course apply the same approach to any TCP port that isn't blocked.

Now, my only concern is that they don't outright block my local network's IP. I haven't WireSharked Synergy, so I don't know how chatty its protocol is when publishing the X and Y coordinates of your mouse, along with mouse and keyboard button events (I can't imagine it's TOO verbose), so hopefully it won't generate an exorbitant amount of traffic and warrant blocking.

Friday, March 18, 2011

Breaking Down Database Query Results in Chunks of 65536 Rows for Excel 2003 Using Office Interop and VB .Net (Memory Efficient Edition)

So, my prior post had a solution for breaking down database query results into Excel Worksheets of 65536 rows each. As mentioned in that post, it is a memory hog and will probably blow through your process's memory allocation. I came up with a more memory-friendly solution to this issue: in essence, this approach trades in-memory processing for I/O processing.

This solution takes the following approach:
  1. Dump the contents of the database query to the Excel 2003 XML format using a SqlDataReader, broken into chunks of x rows (65536 rows in this case, in the spirit of Excel 2003's limitation)
  2. Open said Excel 2003 XML file using Excel Interop
  3. Do any post-processing formatting and niceties to your spreadsheet
  4. Save the file as a normal Excel Workbook
  5. Delete the source XML file (which does indeed get gigantic)
The result is a process that consumes, at most, 40-50 MB of memory (as opposed to the hundreds of MB in the prior approach), with performance close to the prior in-memory + Excel Interop approach.

Code follows.

Excel XML Header (as referenced in the code below)



<?xml version="1.0"?>
<?mso-application progid="Excel.Sheet"?>
<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:x="urn:schemas-microsoft-com:office:excel" xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet" xmlns:html="http://www.w3.org/TR/REC-html40">
<DocumentProperties xmlns="urn:schemas-microsoft-com:office:office">
<Author>enograles</Author>
<LastAuthor>enograles</LastAuthor>
<Created>2011-03-18T16:24:37Z</Created>
<Company></Company>
<Version>12.00</Version>
</DocumentProperties>
<ExcelWorkbook xmlns="urn:schemas-microsoft-com:office:excel">
<WindowHeight>11895</WindowHeight>
<WindowWidth>19020</WindowWidth>
<WindowTopX>120</WindowTopX>
<WindowTopY>105</WindowTopY>
<ProtectStructure>False</ProtectStructure>
<ProtectWindows>False</ProtectWindows>
</ExcelWorkbook>
<Styles>
<Style ss:ID="sDate">
<NumberFormat ss:Format="Short Date"/>
</Style>
</Styles>



Implementation Code



' Assumes Imports System.Data.SqlClient and Imports Microsoft.Office.Interop.Excel,
' plus a project reference to the Excel interop assembly.
Public Shared Function ExportToExcelXML(ByVal conn As SqlConnection, ByVal sql As String, _
Optional ByVal path As String = Nothing) As String

Dim excel As Application = Nothing

Try
Dim destination As System.IO.DirectoryInfo
Dim workbookFullPathSource As String
Dim textWriter As System.IO.TextWriter
Dim pageCount As Integer = 1
Dim rowCount As Integer = 0
Dim worksheetWritten As Boolean

' First grab the result set
Dim cmdResult As New SqlCommand(sql, conn)
Dim rdr As SqlDataReader = cmdResult.ExecuteReader

' Generate a new Guid for the Workbook name.
Dim workbook_name As String = System.Guid.NewGuid().ToString()

' First validate the destination
If String.IsNullOrEmpty(path) = False Then
If System.IO.Directory.Exists(path) Then
destination = New System.IO.DirectoryInfo(path)
Else
' Attempt to create it
destination = System.IO.Directory.CreateDirectory(path)
End If
Else
' Just drop it in the executing assembly's root folder if no path is specified
destination = New System.IO.DirectoryInfo(System.Reflection.Assembly.GetExecutingAssembly.Location.Substring(0, System.Reflection.Assembly.GetExecutingAssembly.Location.LastIndexOf("\")))

End If

' Construct the full path
workbookFullPathSource = destination.FullName & "\" & workbook_name & ".xml"

' Instantiate the writer
textWriter = New System.IO.StreamWriter(workbookFullPathSource, False)

' Write the header
WriteExcelXMLHeader(textWriter)

' Iterate through the results
If rdr.HasRows Then
While rdr.Read

' Iterate the sheet if we've reached the limit
If rowCount = 65536 Then
worksheetWritten = False
pageCount += 1
rowCount = 0 ' Reset the counter to 0
End If

' Define a sheet and write the headers
If worksheetWritten = False Then
If pageCount <> 1 Then
textWriter.WriteLine(" </Table>")
textWriter.WriteLine(" </Worksheet>")
End If

textWriter.WriteLine(" <Worksheet ss:Name=""Page " & pageCount & """>")
textWriter.WriteLine(" <Table ss:ExpandedColumnCount=""" & rdr.FieldCount & """>")
textWriter.WriteLine(" <Row ss:AutoFitHeight=""0"">")

' The headers
For i As Integer = 0 To rdr.FieldCount - 1
textWriter.WriteLine(" <Cell><Data ss:Type=""String"">" & rdr.GetName(i) & "</Data></Cell>")
Next

textWriter.WriteLine(" </Row>")

' Yes, the header counts as a row
worksheetWritten = True
rowCount += 1
End If

' Write the actual data
textWriter.WriteLine(" <Row ss:AutoFitHeight=""0"">")
For i As Integer = 0 To rdr.FieldCount - 1
Dim dataType As String
' Escape XML special characters (&, <, >) so the output stays well-formed
Dim dataContents As String = System.Security.SecurityElement.Escape(rdr.Item(i).ToString())

If TypeOf (rdr.Item(i)) Is String Then
dataType = "String"
ElseIf TypeOf (rdr.Item(i)) Is DateTime Then
dataType = "String"
dataContents = CType(rdr.Item(i), DateTime).ToString("MM/dd/yyyy")
ElseIf IsNumeric(rdr.Item(i)) Then
dataType = "Number"
Else
dataType = "String"
End If

' The data with the proper type
textWriter.WriteLine(" <Cell><Data ss:Type=""" & dataType & """>" & dataContents & "</Data></Cell>")
Next

' Terminate the row
textWriter.WriteLine(" </Row>")

' Iterate row counter
rowCount += 1
End While
Else
textWriter.WriteLine(" <Worksheet ss:Name=""Page " & pageCount & """>")
textWriter.WriteLine(" <Table ss:ExpandedColumnCount=""" & rdr.FieldCount & """>")
textWriter.WriteLine(" <Row ss:AutoFitHeight=""0"">")
textWriter.WriteLine(" <Cell><Data ss:Type=""String"">No Results Found from Query:" & cmdResult.CommandText & "</Data></Cell>")
textWriter.WriteLine(" </Row>")
End If

' Close out the reader and the writer
rdr.Close()
textWriter.WriteLine(" </Table>")
textWriter.WriteLine(" </Worksheet>")
textWriter.WriteLine("</Workbook>")
textWriter.Close()

' Process in Excel Interop for formatting and saving to a proper Excel document
excel = New Application()
excel.DisplayAlerts = False

Dim wbSource As Workbook = excel.Workbooks.Open(workbookFullPathSource)

' For all sheets, autofit columns, bold first rows, freeze panes on all worksheets
For i As Integer = 1 To wbSource.Worksheets.Count
Dim ws As Worksheet = CType(wbSource.Worksheets(i), Worksheet)
ws.Select()
CType(ws.Cells(1, 1), Range).EntireRow.Font.Bold = True
CType(ws.Cells(2, 1), Range).Select()
excel.ActiveWindow.FreezePanes = True
ws.Columns.AutoFit()
Next

' Select the first Worksheet
CType(wbSource.Worksheets(1), Worksheet).Select()

' Save as a workbook, exit out of Excel
wbSource.SaveAs(destination.FullName & "\" & workbook_name & ".xls", FileFormat:=Microsoft.Office.Interop.Excel.XlFileFormat.xlWorkbookNormal)
wbSource.Close()
excel.Quit()

' Delete the XML source (which does indeed get gigantic)
System.IO.File.Delete(workbookFullPathSource)

Return workbook_name & ".xls"

Catch ex As Exception
' Rethrow, preserving the original stack trace
Throw
Finally
If excel IsNot Nothing Then
excel.Quit()
System.Runtime.InteropServices.Marshal.ReleaseComObject(excel)
excel = Nothing
GC.Collect()
End If
End Try
End Function

''' <summary>
''' This routine will write an Excel XML header before defining worksheets
''' </summary>
''' <param name="textWriter"></param>
''' <remarks></remarks>
Private Shared Sub WriteExcelXMLHeader(ByVal textWriter As System.IO.TextWriter)
textWriter.WriteLine(My.Resources.resMain.ExcelXMLHeader)
End Sub
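
A quick usage sketch -- the class name ExcelXMLExporter, connection string, query, and output path are all hypothetical:

' Hypothetical usage -- adjust the connection string, query, and output path
Using conn As New SqlConnection("Data Source=.;Initial Catalog=SampleDatabase;Integrated Security=True")
    conn.Open()
    Dim fileName As String = ExcelXMLExporter.ExportToExcelXML(conn, "SELECT * FROM dbo.BigTable", "C:\Exports")
    Console.WriteLine("Wrote " & fileName)
End Using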


Thursday, March 10, 2011

Breaking Down Database Query Results in Chunks of 65536 Rows for Excel 2003 Using Office Interop and VB .Net

March 10, 2011. Still hard to believe the date. Even harder to believe is how enterprises hold onto older versions of MS Office -- in our case, MS Office 2003. Yes, Excel 2007/2010 and the Ribbon UI bring a bit of a learning curve, but having worked extensively with 2003 and 2007/2010, I can honestly say that I am faster and way more productive on the new Ribbon UI. I have drunk the Kool-Aid; I love the newer versions of Office.

The newer versions of Office give us some niceties not afforded to the 2003 version. Some are neat (the Excel file being a glorified Zip file of standards-based XML), and some are practical... in the case of the latter, the elimination of that pesky 65536-row limitation. Alas, with my enterprise still on 2003, it is a limitation we developers need to live with. We have various utilities in .Net that export results out to Excel spreadsheets, because let's face it, as an ad-hoc BI tool, Excel is pretty darn good at analyzing volumes of data.

With that in mind, when you have a requirement to publish large amounts of data (think hundreds of thousands of rows) to an Excel 2003 Workbook, it poses a bit of a challenge. Specifically, how do you break up your results into pages of Worksheets (65536 rows in each page) that the end-users can manipulate to their hearts' content? You run into two issues: (a) how to break down your results into chunks of 65536 rows, and (b) how to insert them in blocks at a time without blowing through Range.Value2's memory limitation (which, as far as I can tell, is not even published)?
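The heart of the answer to (b) is that Excel Interop will accept an entire two-dimensional array through Range.Value2 in a single assignment. A minimal sketch of that trick -- the worksheet variable ws and the dimensions below are placeholders:

' Minimal sketch: paste a 2-D Object array into a worksheet in one shot,
' which is far faster than writing cells one at a time.
' Assumes ws is a Microsoft.Office.Interop.Excel.Worksheet.
Dim rowCount As Integer = 1000
Dim colCount As Integer = 10
Dim data(rowCount - 1, colCount - 1) As Object
' ... populate data from your DataTable here ...
Dim rng As Range = ws.Range(CType(ws.Cells(1, 1), Range), CType(ws.Cells(rowCount, colCount), Range))
rng.Value2 = data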

I came up with a routine that facilitates this functionality. Just a fair word of warning, it's a bit of a hack, using standard ADO .Net objects and clever pasting of arrays to the Range.Value2 property of the Excel Interop Model while gratuitously calling the Garbage Collector so we don't blow through our memory allocation (because it uses up a bunch to begin with in persisting the results to memory as an ADO .Net DataTable).


Not my neatest of routines, but it gets the job done. An obvious improvement would be to break the work down into multiple subroutines. Another would be memory utilization -- I've only tested this routine at the upper bounds of our data requirements (about 900,000 points of data), so for larger data sets you may need to tune it a bit so it doesn't blow through the process's memory allocation. Of course, if you come up with a clever way to clean the code up a bit, I'd be happy to hear about it!

Again, I reiterate my HACK ALERT statement:


  
' Assumes Imports System.Data.SqlClient and Imports Microsoft.Office.Interop.Excel,
' plus a project reference to the Excel interop assembly.
Public Shared Function ExportToExcel(ByVal conn As SqlConnection, ByVal sql As String, _
Optional ByVal path As String = Nothing) As String

Dim excel As Application = Nothing


Try
Dim destination As System.IO.DirectoryInfo
Dim workbook_full_path As String
Dim dtResult As New System.Data.DataTable
Dim lstSheets As New Generic.List(Of String)

' First grab the result set
Dim cmdResult As New SqlCommand(sql, conn)
Dim rdr As SqlDataReader = cmdResult.ExecuteReader
dtResult.Load(rdr)

' How many sheets will we need? (SheetCount is a small helper, sketched after this routine)
Dim sheet_count As Integer = SheetCount(dtResult.Rows.Count)

' Generate a new Guid for the Workbook name.
Dim workbook_name As String = System.Guid.NewGuid().ToString()

' First validate the destination
If String.IsNullOrEmpty(path) = False Then
If System.IO.Directory.Exists(path) Then
destination = New System.IO.DirectoryInfo(path)
Else
' Attempt to create it
destination = System.IO.Directory.CreateDirectory(path)
End If
Else
' Just drop it in the executing assembly's root folder if no path is specified
destination = New System.IO.DirectoryInfo(System.Reflection.Assembly.GetExecutingAssembly.Location.Substring(0, System.Reflection.Assembly.GetExecutingAssembly.Location.LastIndexOf("\")))

End If

' Construct the full path
workbook_full_path = destination.FullName & "\" & workbook_name & ".xls"

' Create Excel objects
excel = New Application
excel.DisplayAlerts = False
Dim wb As Workbook = excel.Workbooks.Add()
Dim results_processed As Integer = 0

' Tracking dictionary so we can order the sheets
Dim dicSheets As New Generic.Dictionary(Of Integer, Worksheet)

' Only do the Excel processing if we have results
' Create the right number of sheets
' And drop data into each sheet
If sheet_count > 0 Then
For i As Integer = 1 To sheet_count

' Create a sheet
Dim ws As Worksheet

' Put in order
If i > 1 Then
ws = wb.Worksheets.Add(Type.Missing, dicSheets(i - 1))
Else
ws = wb.Worksheets.Add
End If

dicSheets.Add(i, ws)

Dim results_outstanding As Integer = dtResult.Rows.Count - results_processed

ws.Name = "Page " & i.ToString()
lstSheets.Add(ws.Name)

' Start and end positions of the DataTable. Conditions for if we have
' more than 65536 results, we need to know the breakdown for where we
' should start and end (ordinally) on the table based on what Sheet we're on
Dim dt_start_row As Integer
Dim dt_end_row As Integer
Dim sheet_end_row As Integer

' Data Table Bounds
' Positionally on the DataTable, where are we extracting data?
If i > 1 Then
dt_start_row = (i - 1) * 65536 ' Minus one because the first sheet is rows 1-65536
Else
dt_start_row = 1
End If

If results_outstanding < 65536 Then
dt_end_row = dtResult.Rows.Count
sheet_end_row = results_outstanding + 1 ' We need it to be inclusive
Else
dt_end_row = i * 65535
sheet_end_row = 65535
End If

' Create a two dimensional array
' First dimension = rows
' Second dimension = columns
' Add + 1 to rows because the first row is always the column headers
Dim result(sheet_end_row + 1, dtResult.Columns.Count) As Object

' Publish the row header
For col As Integer = 0 To dtResult.Columns.Count - 1
result(0, col) = dtResult.Columns(col).ColumnName
Next

Dim sheet_row_count As Integer = 1

' Propagate the data down in the array
' subtract one since the DataTable is zero based
For result_data_row As Integer = dt_start_row - 1 To dt_end_row - 1

For result_data_col As Integer = 0 To dtResult.Columns.Count - 1

Dim value As Object = dtResult.Rows(result_data_row)(result_data_col)

' DateTimes come up weird on Excel
If TypeOf (value) Is DateTime Then
result(sheet_row_count, result_data_col) = value.ToString() 'dtResult.Rows(result_data_row)(result_data_col).ToString()
Else
result(sheet_row_count, result_data_col) = value 'dtResult.Rows(result_data_row)(result_data_col)
End If
Next

' Iterate the sheet's row counter (not the data table's row counter)
sheet_row_count += 1
Next

' Arbitrary number of rows to drop at a time so Excel's rng.Value2 doesn't exception out. Needs further refining, obviously
If sheet_row_count >= 65535 AndAlso dtResult.Columns.Count > 30 Then

' Drop in Chunks of 700 rows at a time
' We can change this later as we performance tune
Dim row_chunks As Integer = 700
Dim excel_sheet_max As Integer = 65536

For array_row As Integer = 0 To excel_sheet_max - 1 Step (row_chunks - 1)

Dim number_of_rows As Integer

' If we are at the very end, specify as such
' as it might not be a full 700 rows
If array_row + (row_chunks - 1) > excel_sheet_max Then
number_of_rows = excel_sheet_max - array_row
Else
number_of_rows = row_chunks
End If


Dim result_segment(number_of_rows, dtResult.Columns.Count) As Object
Dim rngBegin As Range = ws.Cells(array_row + 1, 1)
Dim rngEnd As Range = ws.Cells(array_row + number_of_rows, dtResult.Columns.Count)
Dim rngAll As Range = ws.Range(rngBegin.Address & ":" & rngEnd.Address)

' Copy the next chunk of rows into the segment, then paste it in one shot
Try
Array.Copy(result, array_row * (dtResult.Columns.Count + 1), result_segment, 0, number_of_rows * (dtResult.Columns.Count + 1))
rngAll.Value2 = result_segment
Catch e As Exception
' Swallowed deliberately (hack alert): a failed chunk leaves a blank block rather than aborting the export
End Try

' Release the segment before the next chunk
Array.Clear(result_segment, 0, result_segment.Length)
result_segment = Nothing
Next

Else
Dim rngBegin As Range = ws.Cells(1, 1)
Dim rngEnd As Range = ws.Cells(sheet_end_row + 1, dtResult.Columns.Count)
Dim rngAll As Range = ws.Range(rngBegin.Address & ":" & rngEnd.Address)

rngAll.Value2 = result
End If



' Formatting -- bold and freeze header columns, autofit columns
CType(ws.Cells(1, 1), Range).EntireRow.Font.Bold = True
CType(ws.Cells(2, 1), Range).Select()
excel.ActiveWindow.FreezePanes = True
ws.Columns.AutoFit()

' Blank out the array
Array.Clear(result, 0, result.Length)
result = Nothing
GC.Collect()

' Keep track of what we've processed
results_processed += sheet_end_row
Next


' Cleanse sheets - remove Worksheets that aren't named "Page x"
Dim lstRemoveSheets As New Generic.List(Of String)
For Each objWS As Object In wb.Worksheets
Dim ws As Worksheet = CType(objWS, Worksheet)
If lstSheets.Contains(ws.Name) = False Then
lstRemoveSheets.Add(ws.Name)
End If
Next

For Each sheet_to_delete As String In lstRemoveSheets
CType(wb.Worksheets(sheet_to_delete), Worksheet).Delete()
Next
End If

' Select the first cell of the first sheet

Dim wsFirst As Worksheet

' Indicate we received no results
If sheet_count = 0 Then
wsFirst = CType(wb.Worksheets(1), Worksheet)
CType(wsFirst.Cells(1, 1), Range).Value2 = "No results found."
Else
wsFirst = dicSheets(1)
End If

wsFirst.Select()
CType(wsFirst.Cells(1, 1), Range).Select()

' Finally return the workbook name
wb.SaveAs(workbook_full_path)
wb.Close()

' Dispose the dtResult
dtResult.Dispose()
dtResult = Nothing

Return workbook_name & ".xls"

Catch ex As Exception
' Wrap with context, preserving the original as the inner exception
Throw New Exception("Error encountered when exporting to Excel: " & ex.Message, ex)
Finally
If excel IsNot Nothing Then
excel.Quit()
System.Runtime.InteropServices.Marshal.ReleaseComObject(excel)
excel = Nothing
GC.Collect()
End If
End Try
End Function
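
For completeness, the SheetCount helper isn't shown above. Given the routine's layout of 65535 data rows plus one header row per sheet, a plausible implementation (an assumption on my part, not the original code) would be:

' Hedged sketch of the SheetCount helper referenced above: one sheet per
' 65535 data rows (each sheet's first row is the column-header row)
Private Shared Function SheetCount(ByVal rowCount As Integer) As Integer
    If rowCount <= 0 Then Return 0
    Return CInt(Math.Ceiling(rowCount / 65535.0))
End Function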

Saturday, February 26, 2011

Greekstentialism


Kalispera from Athens, Greece! Another day, another data integration project with one of our field offices... and thankfully, this one happens to be in one of the cities I've been meaning to visit since I was a child. Growing up a product of the US public school system, you'd often hear stories about ancient Greek civilizations and their culture. This country is indeed the birthplace of democracy and great minds such as Socrates and, of course, being the general math geek that I am, Pythagoras. Heck, ancient Greek reading was even required in high school, as we read Homer's Iliad and Odyssey in our English classes (the latter being one of my favorite books of all time). All things considered, despite having fallen on hard times recently, I believe Greece is still one of the most fascinating countries in the world. The picture to the left is me seated in front of the Parthenon at the Acropolis, a truly amazing structure that I'd rank up there with the Egyptian pyramids as a "must see before time wears it down to nothing."

Anyway, geeky tourist fascination aside, I've learned much about the modern Greek culture since I've been here. Some tips for my fellow Americans who may have never visited Greece before and are planning on coming by in the near future:
  • The Greeks LOVE cheese. Not just feta either. Cheese is practically on every dish. For those who are lactose-intolerant, bring all appropriate OTC treatments.
  • Eat lunch at 12-1 pm? Most Greeks will look at you strangely. Lunch here typically begins around 2:30pm/3:00pm.
  • Eat dinner at 6-7 pm? Again, most Greeks will look at you strangely. Actually, you won't see them looking at you strangely because none of them will be out. Dinner starts around 9:00pm/9:30pm.
  • Common Greek phrases can be found here complete with phonetic pronunciation.
  • Want to eat some amazing seafood while driving by some of the biggest villas in Greece? Take a trip a little north of Athens to a place called Kiffisias.
  • Have ouzo on the rocks with your seafood, then finish off with mastica as an after-dinner digestif.
Word is I might be headed back around these parts sooner rather than later, so I will be sure to post any more tips as I discover them.

Sunday, February 13, 2011

Environmental Geekstentialism

With petrol prices still climbing and a need for a highly efficient car for my fiance, I decided it was time to finally bid adieu to my Honda S2000. I had quite a terrific time with that car, but its impracticality wore my patience a little thin over the years. I would still recommend it for anyone who wants a reasonably priced convertible sports car without sacrificing the fun -- I dare say you won't find a better 6-speed shifter this side of a Porsche 911. I'll still have my 2003 Subaru WRX project car to keep my speed needs at bay.

That being said, as a result of our vehicular needs, we have joined the ranks of the hybrid owners of the world and got ourselves a 2010 Honda Insight. I must admit, I was a little leery of hybrid technology at first; the first-generation Insight was the S2000's environmental cousin (i.e. completely impractical for anything other than fuel efficiency), and the prior and current gen Priuses never really did it for me as a driving experience... not to mention the steadily climbing price.

With that in mind, I was able to hunt down a certified 2010 Honda Insight that struck our fancy with its price point. In test driving the car, I found myself to be a bit surprised -- the handling of the car is, dare I say it, actually fun. Its handling characteristics reminded me of my cousin's Honda Fit, with which I'm sure the Insight shared some of its components. The steering is very responsive, definitely much more responsive than the Priuses I've test driven, and it feels pretty nimble on its feet in normal and highway traffic.

Don't get me wrong, speed and acceleration-wise, the Insight is the polar opposite of the S2000, but my fiance was more concerned with reliability and fuel efficiency. In both aspects, I'd say Honda excels. Reliability? It's a Honda. I know it's become the company's unofficial motto, but their cars simply are highly reliable. I still remember my parents' old 1989 Civic. My family piled on about 200,000 miles and 15 years on that car. Engine never blew, only had to replace the usual wear and tear items, and overall was a great little car. Ditto my S2000 -- not a problem with it mechanically in my 3 years of ownership (and, um, enthusiastic driving).

Fuel efficiency... now this is where the Insight really impresses the geek in me. Most of the new hybrids have a similar system, but I believe the Insight's execution of it is a little better on the user side. The car has a built-in ECO Mode which maximizes the car's fuel efficiency to go along with the brilliant ECO Guide user interface on the dash. The ECO Guide is what I'm impressed with the most. The speedometer (mounted at the top of the dash, a la the new Civics) has an ambient light that glows green when you're driving in a fuel-efficient manner and dark blue when you're driving like a maniac. To help coach you into better fuel efficiency, there is also a UI on the dash called the Eco Drive Bar which shows you efficiency ranges as you accelerate and decelerate. I think this system, at least for me, works a bit better than Ford's pretty "leaves" UI. The ambient lighting on the dash gives enough of a high-level overview of your fuel efficiency, while the Eco Drive Bar can show a driver specifically how his inputs affect his fuel efficiency. So, even a lead-footed speed addict such as myself can be trained into maximizing his fuel efficiency. Not bad at all.

So, what's the verdict on the fuel efficiency? The sticker says 40 city/43 highway. I drove from the hills of Manayunk to Abington, taking all local hilly roads, and I easily beat the city average and got closer to 45 mpg. I'd imagine that reaching 50+ mpg on the highway is easily attainable. The Prius still beats those numbers by about 10-20% with its slightly more advanced hybrid system, but again, I felt the driving experience in the Insight was slightly more fun than that of its Toyota competitor... not to mention I'm kind of biased toward Honda products.

While the car enthusiast in me may mourn the loss of one of the best production sports cars ever built, the Eco Geek in me will rejoice in (maybe) once-a-month fill-ups, lower insurance premiums, and a more positive environmental impact. That will certainly help.

...and I think my WRX is about due for a new turbo and intercooler anyway ;-)