James Fee GIS Blog
He’s only running for president, close enough I say… Connecticut clearly wants to be close to Canada. And let’s be honest, Massachusetts was too large anyway. It’s more manageable now.
I made a story map today. The process is a bit rough around the edges but I worked through it. I’ve used more Esri in the past month than in the past 5 years.
I’ve been playing with ArcGIS for Server 10.3.1 at Matrix and we’re all about running things with hosted services. So rather than spec out some hardware and install ArcGIS for Server on local legacy machines, we’re doing it all in the cloud. Because I’m new here there wasn’t any legacy AWS use so I was able to pick Azure for deployment. My logic:
- While I’m experienced with AWS, Azure is mostly an unknown world to me. Given we’re running Windows servers with SQL Server, why not go native.
- I really want to give SQL Azure a spin.
- The portal for Azure is much nicer than AWS. They have those stupid panels in places1 but mostly it makes logical sense.
- Esri has Cloud Builder to simplify installation, which I thought would be great for spinning up prototypes quickly.
So logical, no? Well, late yesterday I sent out this tweet.
/me We should try ArcGIS Server on Azure /me Seems easy enough /process fail /me should have known
— James Fee (@cageyjames) November 5, 2015
I was stuck here:
You can literally hear the sad trombone sound. Now Sam Libby was helping troubleshoot but things were still a bit weird. Basically, as you can see in the error above, I needed to accept an EULA. Of course I went into the Azure Marketplace and followed the instructions to allow the Esri VM to be deployed programmatically, which is what Cloud Builder requires. But each time it errored out the same way.
Sam offered this:
— Sam Libby (@s_libby) November 5, 2015
Basically he hit upon it. Microsoft did something with the marketplace and for whatever reason the Cloud Builder app won’t install an Esri ArcGIS for Server VM until you actually install it first yourself.
The workaround to get the Cloud Builder app to run is to just create a VM using the Azure Portal and then delete it.
After that, the Esri Cloud Builder app runs perfectly without trouble.
Philip Heede basically confirms everything.
— Philip Heede (@pheede) November 6, 2015
So the ArcGIS for Server Cloud Builder2 works great. While I don’t like wizards in general, it automates the processes that take time and lets you focus on the ArcGIS for Server settings you want to change. I honestly haven’t installed ArcGIS for Server since it was ArcGIS Server (without the for) 9.3.1 and it was interesting to see how things have changed and how little has actually changed.
“By dropping ‘points’ on a map within the Survey App, you indicate your position within the venue as you walk through,” reads the app description. “As you do so, the indoor Survey App measures the radio frequency (RF) signal data and combines it with an iPhone’s sensor data. The end result is indoor positioning without the need to install special hardware.”
Interesting in the sense it appears to be an app that stores can use to map their interiors with iOS devices. It’s not a crowd sourced indoor mapping application. This dovetails nicely with the other announcement this morning about their new Maps Indoor service.
For now, Apple is focusing its efforts on a handful of venues that meet specific criteria. These requirements include:
- The venue must be accessible to the general public
- The venue must draw more than a million visitors per year
- Apple requires “complete, accurate, and scaled reference maps” for consideration
- The venue must have Wi-Fi throughout, and an official app available on the App Store
The groundwork is set for Apple to start mapping interiors of these large open venues. But with an app and an iPhone, clearly Apple is planning to scale this out to just about every indoor location. I suspect we’ll see stadiums, amusement parks and other entertainment venues appear first over the next year.
So you may have seen last week that I resigned from AECOM.
Today is my last day @aecom
— James Fee (@cageyjames) October 16, 2015
Well I’ve ended up at Matrix New World Engineering as the National Practice Leader for GIS and Geospatial Services. I’m going to miss the guys at AECOM and working as Project Manager on the BLM Navigator1 data sharing portal but the opportunity with Matrix is something I could not pass up. In a twist, I will be working more closely with Esri technology. That means you’ll see me blogging more about Esri again. That said, the first program I bought at Matrix was Safe FME Desktop, so you can see my overall goals aren’t changing.
Tied in with this is ArcGIS for Server on Azure. I’m jumping in with two feet it appears. But don’t worry, you won’t be seeing any ArcObjects or Dojo posts from me. It’s interesting trying to catch up on where Esri server software has gone over the past 5 years I’ve ignored it. Google searches of course make me laugh a bit.
Trying to research ArcGIS Online but all I find is inflammatory blog posts by me. #circularreference
— James Fee (@cageyjames) October 23, 2015
Now, I had promised that Hangouts with James Fee would start back up in mid-October. Well, given my job change it was difficult to get that going again. It’s being pushed back to November and my first guest will be the always interesting Ian White. Stay tuned for the schedule.
Hangouts with James Fee Season 4 starts soon. We’re going to continue as a podcast so make sure you stay tuned for the guests. I figure it is fitting that the last video hangout I had with Steve be posted as the “first” podcast. Enjoy and I’ll see you all after my honeymoon.
I’m getting married next week so I’m going off the grid, but when I get back Hangouts with James Fee returns for Season 4. The big change? It’s a podcast.
Back in June, Mapbox received their Series B round of $52.55M. With that, Mapbox has turned up their development on just about everything and had a grand old time at the Esri UC buying just about every advertising space outside the San Diego Convention Center. At the time Eric said:
We’re creating the building blocks for a complete mapping stack. This extends way beyond a map.
Today CartoDB has announced their Series B:
We are excited to announce the close of $23 million in Series B financing to expand CartoDB’s mission, enabling anyone to map their world’s data and leverage the power of location.
This moment is truly important because it sends a strong message about the location intelligence revolution as renowned investors validate our position and direction in this growing market. I would like to acknowledge the hard work done by many people in the company in the process — you guys rock!
Spatial IT is hot stuff right now. That’s about $75M in Series B funding in little over 2 months. Bubble? Probably not, since both companies have built real businesses with important clients. 5 years ago there were many more spatial startups running around trying to get money; GeoIQ, SimpleGeo, WeoGeo1 and Geoloqi all received some funding but ended up being acquired mostly for staff or client lists. CartoDB and Mapbox are a different beast, and their sustainability has been rewarded with large investments.
So what’s going on with CartoDB now? It appears they’re going to continue investing in the core product and make it bigger and easier to work with. I’m excited for them, the elephant is going for a ride!
10 years ago, Google Earth was still somewhat unknown. It had its big coming out party with a natural disaster1 and people started doing amazing things with it. If there was one person back in 2005 that knew XML spatial formats, it was Ron Lake. He wrote a commentary on KML 10 years ago this week. I for one read his article with 10 years of time to think about the implications of KML and see why, from his perspective, KML was not able to handle his needs.
Back then we all thought KML was the future and there wasn’t much that couldn’t be done. I think now we all realize that KML is the new PDF except we knew that 10 years ago. XML of course is never the answer…
In light of some recent questions, I would like to take to a moment to discuss what the GeoGig team is up to and what additional steps are being taken to prepare for GeoGig 1.0. To frame this discussion I will highlight two steps (rebase and tag) that are used to prepare a data product for release in GeoGig.
I wrote this article last week about version control and GIS and noted that I thought GeoGig is not dead. Clearly it isn’t, and it appears that development is ongoing, just not directly on GitHub, which I believe is the source of some of the confusion. We’ll all keep an eye out.
Who can go to a spatial conference and not see the GIS superhero t-shirts, the presentations with superhero references, and the job ads claiming that if you work for them, you’ll be a GIS superhero? It’s awkward, right?
But where did this superhero meme come from? I’ve been trying to determine what set this all off and I think the first modern1 reference to a GIS superhero was Safe Software back in 2011.
I’m guessing this showed up because of the FME 2011 beta screen.
But this all turned crazy when, in 2012 at the ESRI UC, MapMan arrived, apparently created by Daniel Gill. After that moment, Esri conferences and meetups all had MapMan/MapGirl swag to buy or give away. And with that, GIS people started referring to themselves as superheroes, hiring superheroes and the rest. I would have thought this meme would die a quick death because it’s a bit ridiculous, but if there is one thing GIS people love to do, it is oversell themselves. Just use hashtag #mapman to see it all on your favorite social media network.

The GeoMonkey never had a meme. You left him behind.
I’m sounding like a history professor here↩
Before the advent of GPS and navigation apps, cartographers sneaked “paper towns” and “trap streets” into their maps—fake points of interest that they used to detect plagiarism. If someone copied their map, it would be easily identifiable through the inclusion of those locations. That same trick has found its way into modern-day mapping systems: A new lawsuit brought against Google and its traffic app Waze cites sham points of interest as evidence that the Google-owned service copied from a competitor’s database.
Apparently these two companies tried to make a deal before Google snapped up Waze, and PhantomAlert is alleging that Waze used their database to “boost its profile”. One of the biggest concerns in the OpenStreetMap community is allowing these intentional mistakes into their database. Copyright Easter Eggs are well documented on the OSM website.
A Copyright Easter Egg, in terms of mapping, is a feature that is drawn in a distinctive way in order to help identify its original author. It may be a nonexistent, or slightly or heavily distorted, map feature, or its name may be wrongly or unusually spelt.
The supposed main purpose of such a feature is to strengthen the author’s case in a copyright dispute. If he can show that his own unique feature appears in the defendant’s work, it is easier to prove that the defendant’s work is a copy of his.

Hey look, I got to use the new Google logo already!
Yeah, so if this is true, PhantomAlert has a pretty good case that Waze stole their data, and it could mean big trouble for Google. Having a closed database like this opens Waze up to these kinds of lawsuits because they are unable to have the community police the data. The big question is whether this data was imported into Waze intentionally or by accident. I don’t think the latter will get them off the hook, but if there was intent it could be costly. We’ll have to see. The Waze byline about “outsmarting traffic, together” might not be too smart.
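The detection logic behind a trap-street claim boils down to set membership: plant fictitious POIs, then check whether they surface in a competitor’s data. A toy sketch of the idea (all names and coordinates below are invented for illustration, not from the lawsuit):

```python
# Toy illustration of copyright easter egg detection: seed a database
# with fictitious POIs, then test whether a suspect dataset contains them.
# All names and coordinates here are invented for the example.
planted_fakes = {
    ("Paper Town Books", 42.1083, -74.9917),
    ("Trap Street Cafe", 40.0001, -75.0001),
}

def looks_copied(suspect_pois, fakes, threshold=1):
    """Flag a dataset if it contains at least `threshold` planted fakes."""
    hits = fakes & set(suspect_pois)
    return len(hits) >= threshold, hits

suspect = [
    ("Real Diner", 40.7500, -73.9900),
    ("Trap Street Cafe", 40.0001, -75.0001),  # a planted fake shows up
]
flagged, evidence = looks_copied(suspect, planted_fakes)
print(flagged)  # True
```

A real dispute of course hinges on scale and intent, but even a handful of matches on POIs that exist nowhere in the real world is hard to explain away.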
Seriously right? About time! Most users of Google’s Maps API are caught between the free tier and the “I wish I had a business model to pay for the premium tier” pricing. But Google has figured this out and introduced a new way to pay for the Google Maps API.
Today we’re introducing a simple and flexible option for developers to instantly and easily scale with these Web Service APIs, by opening them up to pay-as-you-go purchasing via the Google Developers Console. In this new purchasing structure, the Google Maps Geocoding, Directions, Distance Matrix, Roads, Geolocation, Elevation, and Time Zone APIs remain free of charge for the first 2,500 requests per day, and developers may now simply pay $0.50 USD per 1,000 additional requests up to 100,000 requests per API per day. Developers requiring over 100,000 requests per day should contact us to purchase a premium licence.
This is huge because now you’ll know what you’re paying for the API rather than wait for that huge bill at the end of the month. Knowing what things are going to cost is key to building spatial applications. Provide your billing details and build away!
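The new pricing is simple enough to model from the numbers quoted above: first 2,500 requests per day free, $0.50 per 1,000 after that, capped at 100,000 per day. A rough sketch of the math (not an official billing calculator):

```python
# Model of the pay-as-you-go pricing described in Google's announcement:
# first 2,500 requests/day free, then $0.50 per 1,000 requests, up to
# the 100,000 request/day cap (beyond that you need a premium licence).
FREE_TIER = 2_500
DAILY_CAP = 100_000
RATE_PER_1000 = 0.50

def daily_cost(requests: int) -> float:
    """Estimated daily cost in USD for one Maps Web Service API."""
    if requests > DAILY_CAP:
        raise ValueError("over 100k/day requires a premium licence")
    billable = max(0, requests - FREE_TIER)
    return billable / 1_000 * RATE_PER_1000

print(daily_cost(2_500))   # 0.0 -- inside the free tier
print(daily_cost(10_000))  # 3.75 -- 7,500 billable requests
```

At the cap that works out to $48.75 a day, so you know your worst case up front, which is exactly the predictability the post is talking about.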
Yesterday I posted about Chris Hogan’s walk-through of generalizing data in PostGIS to make it usable in a web app. Basically he went through the process of finding out what is the sweet spot of quality vs speed. But there are other ways to accomplish this. Mapbox happened to post about a new library called geojson-vt.
Let’s see if Mapbox GL JS can handle loading a 106 MB GeoJSON dataset of US ZIP code areas with 33,000+ features shaped by 5.4+ million points directly in the browser (without server support):
Wait, what?! A few seconds loading the data, and you can browse the whole data set smoothly and seamlessly. But how exactly does that work? Let’s find out
So that’s actually pretty amazing. We all know what GeoJSON does in the browser and how it impacts the speed of map drawing. 100 MB+ of data rendering that quickly? Impressive. Read the whole post to see how they do it and the details on how to start using it. The only limitation is that it requires mapbox-gl-js or Mapbox Mobile1. UPDATE: Per Tom MacWright:
— Tom MacWright (@tmcw) September 2, 2015
Still this comes down to using tools that make your mapping products better. Maybe Mapbox does that cheaper and quicker than you could on your own. This kind of on-the-fly simplification is what we’ve all been asking for and Mapbox is really pushing the envelope. This could be what gets people to start using their platform.
which is actually a big limitation if you think about it↩
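For a sense of what slicing GeoJSON into tiles means here: geojson-vt cuts the data along the standard web-mercator tile grid and simplifies each tile to its zoom level, so the browser only renders what the current view needs. The addressing math for that grid is well known; a minimal sketch (the general scheme, not geojson-vt’s actual code):

```python
import math

# Standard web-mercator tile addressing: which z/x/y tile a lon/lat
# falls in. Tiling along this grid is what lets a library like
# geojson-vt hand the renderer only the features for the tiles in
# view, at each zoom level's precision.
def lonlat_to_tile(lon, lat, zoom):
    n = 2 ** zoom  # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return zoom, x, y

print(lonlat_to_tile(0.0, 0.0, 1))  # (1, 1, 1) -- the south-east quadrant
```

At zoom 14 (a typical max for tiled GeoJSON) the world is 16,384 tiles on a side, which is why per-tile simplification keeps even a 106 MB dataset responsive.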
10 years ago this week Katrina had rolled in and there were lots of posts on Spatially Adjusted about DigitalGlobe and Google Maps imagery being updated for the flooding. But the post that caught my attention was this one on ArcScripts:
Can someone at ESRI please clean up the ArcScripts site? Plain as day on the ESRI ArcScripts upload page it says “Not for samples or demos of products sold at Web sites”. There are way too many products that are commercial in there and this latest one takes the cake. 15 days and then you have to buy it, what a joke. If you have to advertise, do it by buying ad space, not polluting the ArcScripts gallery. Geospatial Enterprises is off my list of companies I’ll deal with. XTools Pro 3.0 is also a commercial product that tries to get around by offering some free tools, but it too is just a demo. Someone over at ESRI needs to get serious about cleaning this junk up and off the ArcScripts.
I mean how shady was XTools Pro anyway? The original XTools on ArcView 3.x was open, free and a great tool. Then some guys basically rebranded it for ArcGIS Desktop and started charging money for it. Oh well, the madness of ArcScripts is over, as is the need for tools like XTools. Still funny to think this was how we shared scripts and applications back then. No Github or other platforms to help. Life was so hard back then and we didn’t realize it!
Indoor mapping is the white whale of our Spatial IT industry. We’re always reading about how our smartphones will lead us to the best deals or how I can find the specific nail I need in Home Depot without having to ask anyone or walk down every aisle. The key to all this is essentially iBeacon.
You can search Google News for all the latest excitement on the concept, but essentially it is a way for your phone to know where things are and for the vendors to know where your phone is through Bluetooth. Imagine walking into a store and getting alerts about your favorite beer being on sale and then the ability to navigate directly there. Sexy right? Plus we’ve been anticipating this happening for years. Except…
Google was set to launch a new product that added context to one of its most successful apps, Google Maps. But earlier this year, it was shut down by Alphabet CEO Larry Page, according to people familiar with the project.
Google Here worked by sending a notification to a smartphone user’s lock screen within five seconds of their entering a partner’s location. If the user clicked on the notification, a full screen HTML5 “app” experience would launch. Google Here would know when to send the notification via Google Maps and beacons placed in the stores of participating partners. Google planned to supply the beacons to partners for the launch, according to the document. The experience could also be found by going to the Google Maps app.
Exactly what we thought everyone wanted. In testing, the application was deemed too invasive and Google feared no retailers would sign up. That’s right, Google didn’t think they could get their partners to install cheap beacons in their stores AND they feared they were too big brother. Seems weird, doesn’t it? If there is one company that can get companies to spend money on ads, it is Google. And since when did Google ever think pushing ads on us was “invasive”?
The magic about Google Here1 was that you didn’t need an app running for it to work. Think about that for a minute: ads would appear on your phone based on where you were, and you didn’t need to opt in to get them. Now we see why Google was very concerned that Here was going to get a large backlash. Being able to push ads on users would have been something they really could have sold well to companies; I’m not sure there would be any fear of companies not wanting to push ads on us.
Beacons are still very important to Google. Their Eddystone project talks about lots of uses of beacons but not for ad delivery. Clearly there was feedback on this project and it jolted Google out of their normal sell more ads business model. I think beacons will be very valuable as they start appearing in more areas, but I for one don’t need to get an ad for fabric softener every time I walk into a Target.
Here as in not Here that was owned by Nokia↩
I am working on a project that needs to display all the neighborhood polygons in Baltimore City at one time. The file is relatively detailed… which means that tons of unnecessary polygon nodes are being sent from the backend when, at the zoom level and the level of detail the map users need, the high level of detail is a total waste.
While there are some great hosted options to serve up complex GeoJSON, most of the time it is better served1 to simplify your data. Unless you’re surveying or involved with some sort of lawyer, even a bit of generalization is a good idea with online mapping. Chris does a great job showing how you can modify the tolerance to get your results to look great and save lots of bandwidth. If you’re a generalization newbie, you should read his example and get a better understanding of how it works.
And if you’re an Esri user, the same concepts can be used in their stack as well.
no pun intended↩
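The tolerance knob being tuned in that walk-through is the classic Douglas-Peucker distance threshold used by simplification routines like PostGIS’s ST_Simplify. A toy pure-Python version shows the effect (an illustration of the algorithm, not Chris’s code or PostGIS’s implementation):

```python
# Toy Douglas-Peucker line simplification: recursively keep the vertex
# farthest from the chord between the endpoints, until every dropped
# vertex is within `tolerance` of the simplified line. Raising the
# tolerance drops more vertices (and more bandwidth).
def simplify(points, tolerance):
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    max_d, index = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        # Perpendicular distance from (px, py) to the endpoint chord.
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = ((y2 - y1) ** 2 + (x2 - x1) ** 2) ** 0.5 or 1.0
        d = num / den
        if d > max_d:
            max_d, index = d, i
    if max_d <= tolerance:
        return [points[0], points[-1]]  # everything in between is noise
    left = simplify(points[:index + 1], tolerance)
    right = simplify(points[index:], tolerance)
    return left[:-1] + right  # merge, dropping the shared middle vertex

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(len(simplify(line, 1.0)))  # 4 of the original 8 vertices survive
```

Finding the sweet spot is exactly this trade-off: a tolerance small enough that the shape still reads correctly at your zoom level, large enough that you stop shipping nodes nobody can see.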
Yesterday I had a long post about GIS and version control. I mentioned Git in the article, saying how maybe in the future Git would work with GIS files. A couple of people mentioned an article written by Gretchen Peterson titled, “Huge increase in shareability by combining Git and QGIS“.

This isn’t the version control you’re looking for…
Gretchen does a great job showing how you can manage GIS projects with Git and I encourage everyone to read it. But keep in mind it isn’t version control as I was describing it. Git doesn’t understand shapefiles and other binary GIS files. It will show that a shapefile was updated, but it won’t show what you updated in it, nor will it help reconcile updates. Gretchen is using Git to help her share projects with others, which it does very well. But it isn’t geodata version control, and she outlines that clearly in the post.
We all would love to see Github support shapefiles, but I seriously doubt it will ever happen. For now, you can use Github with GeoJSON files, which isn’t half bad.
Today’s HWJF planning staff meeting was full of new ideas. The biggest one was that HWJF becomes a podcast. One of the biggest feedback requests has been to offer an audio only version of the hangouts for those who want to listen on their smartphones offline. I’ve explored this many times and never really got a good plan in place. But given that HWJF isn’t really visual in nature (looking at my grill for an hour has to be taxing), we’re going to convert HWJF into a podcast for season 4 arriving in October. If there is a visual need to have video, we’ll have special hangouts on the YouTube. What this means is it won’t be live anymore so you can’t point out how wrong I am until after the faux pas has passed.
What it will mean is the podcast should be more consumable and usable by everyone. This is an experiment so we’ll see how it goes moving forward. It will also allow more flexible scheduling of the podcast so we can have guests on who can’t attend during work hours.