map butcher – http://mapbutcher.com/blog

Falling in love with maps…… again
http://mapbutcher.com/blog/2014/05/06/falling-in-love-with-maps-again/
Tue, 06 May 2014 11:19:42 +0000

A wee while ago I lost my map mojo. I’m not sure where it went.

The work I do is so often framed as a GIS or spatial problem, but more often than not the end solution is built around the adoption of other (non-spatial) technology and changes to processes and the way people work. In amongst the complexity I felt the spatial stuff was the simple stuff and my map mojo ebbed away.

At Easter I found myself back in New Zealand after a number of years away. My family had a break on the wonderful Waiheke. When we arrived on Waiheke it all seemed very familiar. As I navigated the winding, lush, fern-laden roads a flood of memories came back to me. Even the names of the streets felt comfortable. Somewhere in my memory I’d been here before.

[Image: IMAG1329]

Over six years ago we held (AFAIK) the first ever OpenStreetMap mapping party in NZ. I remember how blown away I was with the OpenStreetMap project in its infancy – it was going to be big. The democratisation of map making. On that beautiful Saturday I drove those streets on Waiheke, recording the waypoints and street names. It was this act of map making that left those memories with me. I loved being able to rediscover these places and even find the traces of my memories:

[Image: Mapbutcher changesets]

Take a moment and make a map the next time you travel. You may even find your map mojo.

 

 

Map Design Walls
http://mapbutcher.com/blog/2013/04/04/map-design-walls/
Thu, 04 Apr 2013 07:55:17 +0000

I’m currently working on a large project where many of the team members have very little experience with mapping or GIS technology. It’s interesting to hear their views on how the ‘geospatial stuff’ can help create a better solution for our users, and it’s a real eye-opener for me to look at my own little world from the outside in. Anyway, to get people thinking about maps and how we can use maps online in our solution, I’ve created a Map Design Wall.

Design walls are used a lot by design teams as spaces to gather various ideas and concepts together, ranging from low-fidelity wireframes and mock-ups through to examples of products and brands which provide inspiration. At a practical level, a visible design wall is a great way of collecting material together in a single place where we can gather the team and discuss what works and what doesn’t. It’s useful to scrawl on your designs or put stickies on your images to highlight what you like or don’t like and why. You can also collect ideas into sets, such as different legends, different map page layouts, different ways of enabling users to work with layers, different ways to allow users to work with temporal data and so on.

Start your map design wall today.

Mapping…….user scenarios
http://mapbutcher.com/blog/2013/04/02/mapping-user-scenarios/
Tue, 02 Apr 2013 08:31:38 +0000

Too many web mapping applications are built on the back of requirements documents which are entirely misguided and miles away from any real user need. In the case of web mapping applications developed in traditional GIS organisations (utilities, government, engineering etc.), this ‘classic’ approach to servicing user needs is often led by the GIS ‘Professionals’, who in my experience are unfortunately the most ill-equipped set of individuals to do such a job. Where user requirements are specified in this manner and GIS ‘Professionals’ act as the user proxy, disaster inevitably awaits. If you have to ‘specify requirements’ for tendering, I’d encourage you to focus on expressing the intended value your users will gain from a particular feature of the mapping system. For example, instead of specifying a requirement like:

The user should be able to select features on the map and query other layers based upon their selection.

………………you could try and suggest the value of such a feature to a user:

 A user needs to be able to know what overlays exist against a planning application so they can quickly and easily inform the interested party of their obligations.

Expressing the requirement (or user story, if you like) in terms of how a real user will value the feature gives us a much richer insight into the system’s purpose. This brings me around to the subject of this post: how do we work with requirements once they are expressed in this way? This week I worked with a team who used a technique called User Scenario Mapping. The team took a series of user stories and began to work with them to help us understand how a user could work with a system to achieve their goal. It’s a very effective technique and I encourage you to give it a shot. Let’s use the example above and see how it works.

Give the story context

I like attaching real people to user stories. I find it helps me connect the piece of value I’m working on with a person who will use it. It moves a feature from being abstract to concrete. As well as the who, there are other obvious aspects like the what, where, why and how often which can be useful – our user story above is pretty close, but we’ll make a couple of changes:

Jim from the planning section at Bawbag Council needs to be able to know what overlays exist against a planning application so he can quickly and easily inform Barry the builder of their obligations.

What steps will Jim take?

We have our overall story, but now we must walk through what we think Jim should do to achieve his goal. The most effective way to do this is to place a Post-it note on a wall for each step Jim will take. Under each Post-it note you record any comments, questions, assumptions or ideas that you have regarding each step. It’s a good idea to record these things on different coloured Post-it notes. Let’s take the first few steps as an example:

  • Step 1 – Jim searches for Barry’s planning application on the map page
    • Question: What information does Jim have to help him find Barry’s planning application?
    • Question: Does Jim need to search on the map page?
    • Note: Jim does this every couple of days for different planning applications
  • Step 2 – Jim can view the extent of the planning application on the map
    • Question: How should the planning application boundary appear – should it be labelled?
  • Step 3 – Jim can see a summary of the overlay information affecting the planning application
    • Note: Summary information should contain the overlay description, type and affected area in metres
    • Note: Jim should be able to see further details for the overlay if required

…you get the picture. We don’t assume any solution at this point, but we’re trying to establish the logical steps a user like Jim would go through to meet their goal. When you’ve completed these steps you end up with something like this:

You then repeat the process for each user scenario you want to map. Don’t be daunted if you have a lot of user stories. This process can be done incrementally, and it’s time well spent if the outcome is a system which is effective.

By working through a user scenario in this way I found that it helps identify the typical assumptions we make when designing systems. Another useful outcome is that it helps you attach value to each step a user will undertake. Taking our mock example above, Jim simply wants to find out as quickly as possible the answer Barry needs, i.e. what overlays are affecting his application. If we take the requirement as it’s first stated and don’t do any user scenario mapping, we are likely to arrive at some god-awful spatial selection tool solution; however, by spending a small amount of time using this technique we focus in on what’s important to our user and can dispense with any GIS baggage we may have. In the wise words of Plex, ‘Try It You Might Like It’. For more info, this is also a useful post.

Where the bloody hell are you?
http://mapbutcher.com/blog/2012/12/20/where-the-bloody-hell-are-you/
Wed, 19 Dec 2012 23:30:27 +0000

It’s exactly a year since I posted here… where the bloody hell have I been?

In mid-2011 I started to work with a small company just starting out called Geoplex. It’s a real challenge keeping a young small business going. As a company we looked at the way geospatial services and technology were traditionally delivered into organisations and we felt it could be done better. Our customers are demanding more – and so they should. They’re seeking new ideas about how to apply spatial technology and are looking for more cost-effective models of delivery. It’s a really exciting space to be in, and it’s great to be able to say that Geoplex has been welcomed with open arms into the Australian market. A big, big thank you to all our customers for making it a great 2012.

It’s been a truly crazy year. We’ve delivered projects ranging from augmented reality native mobile applications to web-based mapping solutions. Within enterprises we’re working with teams who are embracing change and looking for new ways to deliver value; we’re helping them build capability and interoperate more effectively with their partners. The new year brings new clients with some exciting and ambitious projects. I’m really glad to be working in a small team of talented individuals who all enjoy practising their craft on a daily basis.

As for what 2013 has in store? We’re talking more and more to our customers about using the web as the backbone for data and service provision rather than building stand-alone applications – which is great – and I hope to see some projects in this space get off the ground next year. Our customers are increasingly moving away from relying solely on single-vendor proprietary technology, and as a development team we’re increasingly using open source technology and cloud-based services, whether that be platform, infrastructure or software, so I think we’ll see that trend continue.

To cut a long story short, during 2012 I’ve been head down and arse up – and I’m sorry if you were expecting Lara Bingle.

Ho Ho Ho.

S

 

Subject Matter Experts
http://mapbutcher.com/blog/2011/12/19/subject-matter-experts/
Mon, 19 Dec 2011 12:18:25 +0000

Sometimes on a project your team will include a Subject Matter Expert (SME). The general idea is that these individuals provide deep domain experience to members of the project team who may not really understand the space they’re working in very well – makes perfect sense. A couple of examples where I have seen the SME role used are implementing GIS within utilities and local government. In both contexts the SME provides in-depth knowledge of their respective domain, i.e. a water network, or perhaps a government property model. They may provide expert advice on entity modelling and relationships, provide real-world data examples and scenarios, and of course act as a natural liaison with other parts of an organisation you may need to dig into.

There are a few things however that concern me about how the SME role is often used:

1. Authority

The SME role can exercise a great deal of authority. Executive sponsorship will often cede responsibility to an SME irrespective of the nature of the calls they need to make. SMEs can also be viewed, or boxed in, as the de facto representative for ‘the business’. In a Scrum environment I would caution against using an SME as a replacement for the product owner role. Whilst they provide invaluable experience, they are not necessarily positioned or motivated to assess value or make calls on priorities.

2. They aren’t users

In my experience the SME role is often given to someone who has a great deal of experience. However, that experience comes with a perspective based upon a deep understanding of detail which may not translate well to the goals of the project. Take our utility example: an SME can provide information on the logical and physical networks, helping you model data and understand the relationships and behaviour of the real-world network; however, that depth of knowledge does not necessarily translate to the needs of a customer service representative or an incident response operator. SMEs aren’t always users.

3. Edge Cases

I have found that SMEs can have an intimate knowledge of edge cases. Whilst these are not invalid, you must decide where these cases sit within the overall value you’re trying to deliver – I’ve worked on projects that have literally ground to a complete standstill while edge cases are reasoned over, with the end result being an over-complication of the finished product yielding little real value.

If you’re working on a project with an SME then it can be useful to keep these points in the back of your mind.

 

 

My Baggage
http://mapbutcher.com/blog/2011/12/16/my-baggage/
Thu, 15 Dec 2011 22:07:02 +0000

Over the past few months I’ve been making an effort to drop my Esri baggage. I’m trying to get all Hugh Fearnley-Whittingstall on GIS – I want to start feeding my GIS needs on shit I know the provenance of, and perhaps learn how to manage it all myself too.

I’d like to think that I’ve been an advocate for Open Source GIS software for a while, though not at a grass-roots level – I’ve never committed a line of code or doco to an open source project. My motives are driven by a number of things: firstly, a self-interest in finding out more about alternative open approaches to software development; secondly, I’ve never been completely convinced that the traditional software vendor model was the best way forward from a client perspective; and thirdly, I wanted to understand the choices my clients had when it came to spending their pennies on software. In 2006 I was reading more and more about open source GIS software, I attended the 2007 FOSS4G and I was part of the 2009 FOSS4G organising committee, but my day job has always been centred around the needs of clients with Esri software, so I’ve never been in a position to scratch my own itch.

I have always been frustrated by the polar views on both sides of the proprietary/open source fence. It’s as annoying to me to listen to a dim-witted software salesperson drop FUD as it is to listen to some super *geek* who believes you’ve just committed a heinous crime by running Windows. I’ve not succumbed to the pusherman either – I have always made conscious choices towards proprietary GIS software, mainly because it’s been the source of my mortgage repayments. I still like writing software on top of Esri stuff, but I’ve felt for a long time that I’ve been going through the motions: when Esri move, I move; when Esri change their minds, I follow; and to be perfectly honest I’m tired of listening to the same record.

So I’ve taken some early steps in untying the apron strings and thought I would share some of my opinions.

1. People love security

I’ve always wondered why more people don’t shop around for mortgage deals every few years – like there’s some unwritten rule that when you take a mortgage with a bank you’re signed up for the 25-year term. People tend to like security, or what they think is security. In environments which are naturally safe, adoption of new technology is generally sluggish – people tend to want to use something they’ve used before, or perhaps something they’ve seen their neighbours use. From my experience local government is a good example of this: I’ve found that neighbouring governments tend to use the same GIS software, because they can easily share experience and trade war stories and scars about this or that vendor.

I have a security blanket too – we all do. There aren’t many people I know who have an intimate knowledge of a large spectrum of GIS software. However, I’ve found it very rewarding going for a few hours a day without my security blanket; I feel free. If I was starting a self-help group I would perhaps be chanting ‘It’s OK not to open ArcMap, you’re still a good person’. In some cases security comes in the form of a group of people who can support you or help you deliver, and there is an emerging market of providers in this space who are wrapping open source software in a blanket just for you.

2. Glue

Esri sells software partly because their components fit together neatly like jigsaw pieces. I’m sure Esri will continue to develop this approach, making it as seamless an experience as possible to work between these components – it’s a sensible approach. They use language like ‘system’ and ‘platform’ because this is a key selling point for them; it describes something larger, more unified. As each version rolls off the production line we’re told ‘stories’ about how easy it will all be in this release – this is an appealing story to many people, and very valid in some environments.

Open Source is challenged in this space – projects emerge independently and don’t automatically belong to an umbrella group which will help manage this story. To bridge the gap they rely on collaboration, standards and well-used interchange formats – essentially, knowing your way around these will help you glue the open source projects together. I’m not trying to paint an unfair picture by calling this glue, it just seems appropriate to me. Some providers are selecting a set of well-established tools and standards to base a more unified ‘system’ on – this makes perfect sense to me, it reads like the ‘new and improved’ Open Source story, and I believe it has value.

Personally I’m discovering a love for tools and technology I’ve never used before – it’s exciting to be out of my comfort zone. One thing which has really struck me is that whilst things do occasionally stumble (see no. 3), I have found it on the whole very straightforward. Even when barriers appear, decent resources to help you move forward are abundant.

3. Open Source Crashes too

I don’t know how many times I’ve seen and read about ArcMap crashing! People love a good bitch and moan about Esri software – some GIS people just love a good bitch and moan. Open Source software crashes too – I’ve been using QGIS a bit lately and it crashes as well. However, this is my take on it: firstly, I’m fairly tolerant when it comes to it, because I don’t use it all the time; I just tend to restart and move on. Secondly, if it bothered me enough there are a myriad of places to go and seek a fix, or I could just try and fix it myself. This is of course the freedom I have when I use these products. If I had the time I should really do some work, provide diagnostics back to the projects and support a fix, but as I said these squeaky wheels are all acceptable to me. My guess would be that if the adoption of QGIS rocketed to the same levels as ArcGIS then we’d be hearing a lot more about QGIS crashing (bitching and moaning GIS people, remember); my second guess would be that we’d be waiting a fraction of the time to see these issues resolved.

Software, proprietary or open source, has its issues. Personally I like the idea of being able to resolve defects or amend the software myself (if I have the skills and the need to do so).

4. Home Brand GIS

Heinz baked beans are always front and centre when I buy my beans at the supermarket – and I buy Heinz (English variety of course). Now I won’t go into why I buy Heinz too deeply, but I suspect it’s something to do with childhood memories and conditioning (see no. 1 above). It’s probably also fair to say I’m a sucker for packaging, so when I glance down at the bottom shelf at the home-brand beans I’m simply unimpressed by the label, irrespective of the fact that I know the beans are good quality. Open Source GIS in some respects has a home-brand feel about it to me – I know it’s good quality, but the packaging can be a bit shite. Now if you’re not a shallow, little-brained brand man like me then you’ve probably been eating home brand for some time and can’t get enough – you don’t care about the packaging, you just love them beans.

But what does the packaging matter? Well, firstly it matters only when the product and service it is wrapping are good quality – if you’re brand loyal to a product and service which is inferior then you’re more of a dickhead than me. When I’ve moved across to open source projects like QGIS, GeoServer and TileMill recently, I’ve had to (in some cases) put aside the packaging and judge the results against what’s in my Esri toolkit, and I’m liking what I see. Once you’ve established an understanding of product quality I think it’s fair to look at the packaging – is it mature? Does it have good support (perhaps commercial if required)? Does it have good documentation? Can I get my hands dirty if I need to? What is the user experience like?

5. Esri has given me preconceptions about GIS

Esri has given me preconceptions about GIS. I’ve never been particularly comfortable with Esri’s terminology such as the ‘geodatabase’ or ‘enterprise GIS’ or ‘geodesign’. I can’t help feeling these terms are manufactured. Going outside the Esri world for the first time in years has made me feel like I’m learning first principles again, without my Esri baggage. I don’t wish to spread FUD, and perhaps Esri is legitimate in its desire to talk openly about the problem spaces which GIS can solve – however I feel now that when I look at these problem spaces I’m no longer focusing on an Esri technical approach to solving them, but rather looking at them with a much wider lens. Personally, as a GIS consultant it’s no longer acceptable to me to restrict my view, and Open Source GIS is helping me open my eyes.

6. Organic GIS Software

After a while working with Esri software, and for a distributor of Esri software, I felt like I was going round and round on a mouse wheel – an endless cycle of release, service pack, service pack (promise performance improvement), go to conference, cram more features in and start again. This is after all commercial software – you have to feed the beast, right? Yes – and in amongst this the products evolved and improved – no argument or criticism, but it’s not for me. Somewhere along the line I felt that this endless cycle of growth begins to erode the software: endless extensions, feature bloat, UX problems, quality issues, multiple divergent APIs, all constantly accruing technical debt. Again, don’t read this the wrong way – these problems are not unique to Esri; they seem to me to be a condition of growth, and perhaps like battery-farmed chicken, there is a drive to keep your costs down and margins up – and hey, the punters just keep coming back for that tasteless bird!

Open Source GIS software, I would argue, comes from a different ecosystem, where there are multiple smaller projects and competition thrives and drives improvement. Despite my comments above about quality, on the whole I’ve found the quality of open source GIS software to be as good as, if not better than, similar commercial products (see Update *). This ecosystem is maturing to provide choices in terms of software support and development, and demand is increasing. I love the lack of marketing and sales tripe in Open Source too. It’s straight up, no bullshit, does what it says on the tin, wholesome GIS.

Footnote:

The title for this post came from the following story:

Back at the 2009 FOSS4G, Volker Mische had just done a great presentation on CouchDB and was taking questions from the floor. A legitimate question came from an individual about how one could represent relationships à la RDBMS in these new-fangled NoSQL DBs; before Volker could answer, a friend of mine shouted from elsewhere in the crowd:

“Hey man, you’ve just got to leave all that relational baggage at home, man”

Update

* Thanks for the feedback from Anthony – I have no quantitative evidence to support this claim. FUD, guilty as charged.

GIS on the web is OK….sometimes
http://mapbutcher.com/blog/2011/10/04/gis-on-the-web-is-ok-sometimes/
Mon, 03 Oct 2011 20:40:47 +0000

Something caught my eye the other day – a tweet about the usability of GeoExt as a toolkit for building web mapping applications.

In the past couple of years it’s been popular to criticise web mapping applications which were/are designed around the expectations of desktop GIS users. Before we all jump on the bandwagon we should remember that these were expectations built by the GIS industry advancing towards the web – vendors sold server software based upon the ability to share the power of GIS with the masses. Many of these applications were built by those who now openly criticise them. In some cases their critique has been justified, based upon a failure to captivate a larger audience with these applications.

In the past 5 years in the GIS industry we’ve measured our success perhaps most frequently against Google Maps – have you ever heard a client or customer say “I want it to be as simple as Google Maps”? When we use Google Maps as our measuring stick we are looking at an application which has had universal success. However, in our rush to disparage the last 10 years of web mapping applications in favour of simpler, more targeted applications, have we forgotten something which is valued?

Geographic Information Systems

Much of the criticism levelled at these ‘old school’ web mapping applications is justified and, in my opinion, directly caused by a belief that it was the duty of GIS teams to produce solutions: we spent hours building endless applications which would fulfil the needs of our users, many of them with shocking user experiences. Driving these applications was a demand – a demand for tools to allow people to do their jobs more effectively. For these people the applications are not GIS applications but a means to an end. In many cases, whilst our newly found love affair with the aesthetically desirable makes these applications seem tired, cumbersome and overloaded, they are still used and useful to many people outside our own little world.

These applications in my mind are Geographic Information Systems – they are applications often dedicated to the sharing of spatial information and tooling. They may have business-specific tooling incorporated, but they remain essentially, at their core, GIS. Discounting these applications in some way is discounting the usefulness of GIS (irrespective of their relative user-friendliness). Personally I hate the classic table of contents; I hate the fact that some people put search interfaces on top of tables of contents! There are so many things I hate about these traditional GIS user interface elements, but my objections often stem from these elements being a barrier to wider adoption. My mother, for example, would be lost if I asked her to ‘select’ something on a map – but my mother isn’t using these applications.

There are many faults in these all-encompassing monolithic web mapping applications – however I’m not convinced that the rise in replacement applications is improving the users’ lot. Are they essentially putting lipstick on the pig? Ensuring the application has rich capability may in fact be a valid use case based upon the needs of your audience – don’t then remove it or replace it with glitter because you can! I don’t think it’s unreasonable for applications that were once the domain of the desktop to be available within a browser; on the other hand it’s not always necessary or sensible to have everything in there!

Targeted & Disposable mapping applications

The nature in which large organisations deliver and implement systems has to change – our expectations of how we interact with software are changing. Think about how disposable an application is on your phone: I have downloaded applications, used them and disposed of them within the duration of a train journey home from work. Any system which takes two years to roll out is a system and an organisation which is failing. These new ‘disposable’ applications often have less scope and are targeted towards the presentation of information in such a way as to deal with a smaller problem space. They are clear, uncluttered and aesthetically well designed.

The very nature of IT system procurement and implementation can shape the outcome – the tendency towards building the all-encompassing application can be caused by a desire to have a solution in place for x number of years – a solution which will satisfy all ‘stakeholders’ – a solution which is built around a strategy of expansion and compromise. The days of expensive web mapping frameworks are over (or at least they should be) – there’s nothing wrong with a web mapping framework which delivers configuration, ease of administration and rich GIS tooling, but don’t pay through the nose for it. Why? Because they have a limited audience and lifespan.

A good recent example of a web mapping application in this space is the Atlas of NSW. The first thing I like about this site is its name – it’s an Atlas – no fussy ‘geo-bollocks-portal’ nonsense – nice and simple – everyone knows what an Atlas is! I also like the way it makes decisions for the user: you’re not overloaded with too much, you can change a map but you don’t need to understand anything about layers. Even simple things like the extent of the map when the application starts are right – this is a map about New South Wales, so they don’t start the map at a scale showing the whole of Australia.

Information Systems

The Atlas of NSW demonstrates a web mapping application which is centred on the map – I like this, but it’s a very map-centric view of information. There is a growing number of ways in which data can be presented and visualised, and mapping is just one of these. That phrase ‘spatial is special’ is navel-gazing tripe, and more recently I’ve seen great sites which treat spatial information no differently to other data; these sites use maps as one mechanism to guide the user through a story. This is a much more developed view of information, where presentation does not take precedence. I heard a great comment last week about letting the data speak for itself – I thought it was a wonderful way of describing what we should do more of. A couple of nice examples of sites which are taking this approach of combining maps into a broader information system are the NAI Violence against Journalism in Afghanistan site and the National Broadband Map site.

There’s certainly a move towards using maps as a visualisation mechanism in broader information systems, and thankfully traditional mapping applications are moving past *Portals* and *GeoGuff* sites into something a whole lot more palatable for anybody to understand and use. But let’s not get swept up and totally dismiss the use case for Geographic Information Systems – and yes, sometimes it’s OK to do this stuff in a browser, have tools on a toolbar and a table of contents (…I can’t believe I’m saying this!)

Brewing Maps with TileMill
http://mapbutcher.com/blog/2011/10/03/brewing-maps-with-tilemill/
Mon, 03 Oct 2011 10:04:53 +0000

The Melbourne Open GIS meetup has been growing for a little while now and last Thursday it was great to see our biggest crowd yet assembled at OpenHub for some beer and banter. As usual it was a good melting pot of folks who share a common interest in maps.

After some shoddy technical ramblings on my part and a last-minute dash to the *JOHN* (who incidentally could have been the first person to invent tiling…) I did a short 50-metre sprint on TileMill – the slides are below, but you should probably just skip them and download it.

A big thank you to all those who came along, looking forward to the next one.

Simon

Methodology Smells
http://mapbutcher.com/blog/2011/09/07/methodology-smells/
Wed, 07 Sep 2011 12:22:46 +0000

In this post I’d like to provide some guidance on selecting consultants to do your project work; specifically, I’d like to discuss the various methodologies that consultants say they adopt when delivering projects. First off I’ve got to admit (…before I start ranting) that I’ve spent a number of years with my head lodged firmly up my own fundament with regards to software delivery process and methodology, and it’s only now that I can write, with some perspective, my true thoughts on process.

Generally when engaging with consultants, the matter of how the work will be delivered will be discussed. This, after all, is an important criterion in your assessment of suitability; however, the following points are aimed at shining a light on some of the things you may wish to look for, and avoid, when making an assessment:

…things to avoid….

Tripe

I recently read the following from a ‘leading’ consultancy (…isn’t it funny how many consultancies are ‘leading’ – perhaps this will be the focus of another post – what does it really mean to be ‘leading’?)

Included in <<X methodology>> is the detailed <<X>> Information Management and Performance Management Framework specific to the delivery of Performance Management strategy and solutions and the underlying Information Management structure a pre-requisite for effective Performance Management.

God knows what this means! Frustratingly, bloated consultancies and their tripe marketing departments churn this nonsense out by the bucket load. Alas, some poor soul will find themselves sucked in by this, perhaps because they’ve been reading about other ‘pre-requisites’ which are apparently required. In fact much of this methodology bullshit is built upon a basic fear factor: you must do ‘this and that’ or several horrific things will happen to your project. If you don’t pay the exorbitant daily rate for an individual who happens to know how to open Microsoft Project (…and who will help you conquer these many pre-requisites), hell will open its gates and swallow your project up whole! In many cases you’ll find that your $$ will translate to more documents littered with nonsensical language which will sit on your shelves for ever more and have no bearing on the success of your project. This is classic methodology tripe; if you happen to like your tripe then suck it up.

Diagrams

It’s almost unheard of for a methodology not to include a complex diagram of some sort – the trend seems to be that the more exotic and incomprehensible the diagram, the more valuable the methodology, or so we would be led to believe. These diagrams are aimed at taking the complex and distilling it down into a single sheet of A4, a quick view of how it will all work, but software projects in particular are simply not like that. I can’t remember a single project I’ve ever worked on which fit neatly into one of these diagrams. Don’t be fooled by them; whilst diagrams are a useful way to illustrate a process, they should be taken as such and certainly not viewed as a sign of process maturity.

My *favourite* diagrams are those that involve wheels….you know you’ve got a winner when they design a process around a wheel.

Zeitgeist

Many software delivery methodologies are reactions against the status quo. Many are justified reactions based upon existing processes not working; however, with each new methodology comes a flurry of consultants jumping on the bandwagon, promising a ‘new and improved’ way of doing things. Don’t ignore this, just apply caution and, like any good analysis, scratch the surface a little and find out just how much reality is behind the claim. For example, I’ve been amazed by how many interviews I’ve conducted in the past couple of years where the interviewee has claimed experience of agile methodologies yet, when probed, has demonstrated very little understanding – like any skill, claimed experience and expertise in the use of a process or methodology should be carefully assessed.

Hybrids

“We took the best of breed…”

This sort of statement can often be attributed to methodology fashionistas (see Zeitgeist above). I’m not a big fan of hybrid methodologies. A consultancy putting forward a hybrid methodology is, I suspect, a consultancy trying to placate conflicting camps, too inexperienced or unsuccessful to base their approach on a single proven method. I’m sorry, but I find it tough to understand how waterfall and an agile approach can be used on the same project. I’m sorry, but I don’t think story boards and burn-downs sit neatly alongside Gantt charts. To me hybrids smell of confusion and I would steer well clear.

…….look for and remember……

A means to an end

As I mentioned, I have spent some time being a little process-obsessed, and at times I’ve lost perspective on what it was I was actually aiming to do – choosing the right process is important, but ensuring that it doesn’t own you is just as important. There are two parts of the agile manifesto which I really relate to:

  • Individuals and interactions over processes and tools
  • Responding to change over following a plan

To some extent I think some agile practitioners have forgotten the first statement above – the doctrine of Scrum or Kanban (…or whatever) has become more important than people or communication, or the use of a particular piece of tooling has strongly dictated how a piece of software is delivered. The second statement means so much to me because it simply reflects delivery reality – things change. Look for a methodology which can react well to change and avoid those which take no account of change, or perhaps view change as something which should be penalised.

Demand failure

Critical to any good methodology is an understanding of failure – I think it’s invaluable for any consultant to have an appreciation of their methodology based upon hard experience. Ask the question ‘Have you ever found yourself in trouble on a project?’ Ignore people who lie and say no, but listen carefully when people are truthful, and follow up by asking whether a process or methodology helped them and why.

Don’t Scrimp

I may sound like I’m bagging process, but that’s not the case – I’m bagging waste! You want your project to be successful; without a path to follow it will be a very strange experience. Honestly ask yourself where a process will and won’t add value, and think of your past project experiences. Think about the many projects that were started and never finished, and ask what could have been done differently and what could help this time around. The software landscape changes so frequently that it’s almost impossible to consider a static approach which fits this ever-changing picture. One size does not fit all, and the nature of your project will dictate the nature of the process needed to see it delivered. It’s fascinating to see the changing way in which new technology is being rapidly delivered to clients, using a much greater degree of client engagement, multi-disciplinary teams and a general adoption of leaner and meaner approaches – understand how it can help you and take advantage of it.

 

Ignoring properties when serialising to Json
http://mapbutcher.com/blog/2011/09/06/ignoring-properties-when-serialising-to-json/
Tue, 06 Sep 2011 10:24:10 +0000

This afternoon I was figuring out an interface for a service, and I was trying to establish a nice way of exposing an object as a Json parameter in .NET. The object I wanted to work with is a standard .NET class, however it has properties that I didn’t want to confuse the user with, so I needed to somehow hide those properties when serialising the object. Here’s the method I used.

I am using Json.Net to do all my Json work – it’s really simple to serialise and deserialise a class. For example, the class I’m working with is the SyndicationLink class; to serialise this class I can simply do this:

var json = JsonConvert.SerializeObject(new SyndicationLink());

The output would look something like this:

{
  "SyndicationLink": {
    "AttributeExtensions": {},
    "BaseUri": null,
    "ElementExtensions": [],
    "Length": 0,
    "MediaType": null,
    "RelationshipType": null,
    "Title": null,
    "Uri": null
  }
}

That’s all cool, but I don’t really want my interface object to relate so closely to the underlying implementation; it would be nicer if I could rename the object and hide a few of the properties which I won’t expose. Json.Net provides an attribute called JsonIgnore which you can use to decorate a property; when the JsonIgnore attribute is applied, the property will be ignored when serialising (and deserialising) the object. OK, that’s just what I need, but I also want to make use of the standard .NET class – so how can I do this?

public class Link : SyndicationLink
{
    // Shadow the inherited properties with 'new' so they can be
    // decorated with [JsonIgnore] and left out of the serialised Json
    [JsonIgnore]
    public new Dictionary<System.Xml.XmlQualifiedName, string> AttributeExtensions { get; set; }

    [JsonIgnore]
    public new SyndicationElementExtensionCollection ElementExtensions { get; set; }
}

Firstly I inherit the .NET class, hiding the properties which I don’t want to be seen by using the ‘new’ keyword when declaring identical properties in my wrapper class. I now have the ability to add the JsonIgnore attribute to these new properties. This results in a cleaner Json object when serialised:

var json = JsonConvert.SerializeObject(new Link());
{
  "Link": {
    "BaseUri": null,
    "Length": 0,
    "MediaType": null,
    "RelationshipType": null,
    "Title": null,
    "Uri": null
  }
}

When my service receives this object it can simply deserialise it:

Link myLink = JsonConvert.DeserializeObject<Link>(JsonString);

I can now work directly with the .Net class without any serious heavy lifting needed to convert the object in and out of the service.
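
For completeness, here’s a minimal sketch of the whole round trip – the Program wrapper and the property values are illustrative only, and it assumes the Link class above is in scope with Json.Net and System.ServiceModel.Syndication referenced:

using System;
using Newtonsoft.Json;

class Program
{
    static void Main()
    {
        // Serialise the wrapper class - the shadowed properties marked
        // with [JsonIgnore] (AttributeExtensions, ElementExtensions)
        // are left out of the output
        var link = new Link { Title = "Mapbutcher", Length = 1024 };
        string json = JsonConvert.SerializeObject(link, Formatting.Indented);
        Console.WriteLine(json);

        // De-serialise straight back into the same wrapper type, as the
        // service would when it receives the Json parameter
        Link received = JsonConvert.DeserializeObject<Link>(json);
        Console.WriteLine(received.Title);
    }
}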
