Patrick Cuba – THATCamp St. Louis 2013
http://stl2013.thatcamp.org

SLU Center for Digital Humanities
http://stl2013.thatcamp.org/2013/11/08/slu-center-for-digital-humanities/
Fri, 08 Nov 2013


In the last few years, Saint Louis University has generated some great tools and projects for research. At the least, I am willing to share T-PEN, our tool for the transcription of digital images, and to discuss and demo the forthcoming Tradamus tool (April 2014), which seeks to be a modular but end-to-end solution for creating a digital edition.

I will probably also talk about vHMML, an online learning resource coming out in the spring, and a few other projects I can’t commit to text but cannot keep secret.

In a perfect conference, these demos will only serve to show what chasms are left in tool development as I seek to find the next great need and expand these tools into other fields.

Every project I have worked on has been in collaboration with at least two other institutions and began with a sturdy “This would be cool, if it were not impossible” conversation. I want to finish another one of those conversations, but I would be happy to start one.

Attribution and Collaboration
http://stl2013.thatcamp.org/2013/11/08/attribution-and-collaboration/
Fri, 08 Nov 2013


I am pretty sure there are no technological hurdles left to crowdsourcing everything.

As digital editions and big-data projects begin to allow deeper access to their processes, citing the contributions of those from whom you have lifted already-assembled datasets or important cataloguing conventions becomes very difficult, yet in most applications it can be glossed over without consequence. When these important micro-contributions come from hundreds of people across several disciplines and a range of credentials, the task becomes nearly impossible and much more important.

Massive sites like Wikipedia have developed conventions (in addition to their official flags, stubs, and citation formats) that distinguish among contributors who share knowledge on a topic, those who flit about correcting spelling and grammar, and those who seek out citations to flesh out incomplete articles. Is this folksy approach the future? Can and should the value attached to someone who applies professional polish to a scholarly article differ from that of the workhorse who dropped a mangle of data and conclusions into a public area? Is the artist who created the visualization that makes it all accessible simply an illustrator?

Got me.

Interface Design is a Waste of Time
http://stl2013.thatcamp.org/2013/11/08/interface-design-is-a-waste-of-time/
Fri, 08 Nov 2013


When I first began work in Digital Humanities, I was too ignorant to anticipate how often I would be told this as a developer. So often, it turns out, that I’d like to start an argument.

It seems that a digital humanist who is capable of programming prefers the command line, where she can break into anything she wants. If she is smart enough to research and program, went the reasoning, she is clever enough to decide what customizations make the perfect tool for her research. On the other side, at an institution with the resources for great minds and strong technical support, the most erudite researcher can configure a task so precise that even an uninitiated programmer can run an appropriate analysis and return wonderful data or at least a helpful visualization to the (often digitally hypo-literate) taskmaster.

The pyramids and evolution have shown that if you throw enough bodies at something, it will get done, but a tool is something special. Every craft has a rich history of interplay between those who pushed the limits of possibility and the new designs that made sure that limit was ever-expanding.

Digital humanists represent a very different audience from that of most web design or software projects – an opportunity that is often missed. Time taken to restrict bad data input exposes conflicts with data models that may be based on now-incomplete or incompatible scholarly conventions. The interface is adjusted; the tool is improved. This loop creates a tool that generates better data and more completely describes the scholarly work while simultaneously creating (de facto) or reinforcing data standards. The result is well-composed knowledge that is completely portable, dissectable, criticizable, citable, and reusable.

This success is powered by the scholar, sharpened by the focus of the designer, and accelerated by the tool born from the interaction between them.

I am happy to share my experiences with people beginning projects. I am very interested in hearing from others who have completed reusable digital humanities tools and discussing what sorts of possibilities exist in disparate solutions applied to emerging problems.

XML, OAC, RDF, JSON-LD and the king stood: the universe is metadata
http://stl2013.thatcamp.org/2013/11/08/xml-oac-rdf-json-ld-and-the-king-stood-the-universe-is-metadata/
Fri, 08 Nov 2013


The Open Annotation Collaboration published a data model in February that should be recognized as disruptive. JSON-LD (JavaScript Object Notation for Linked Data), whose 1.0 specification landed on Tuesday, November 5th, is another in a cluster of W3C standards showing that digital objects are beginning to exist as real things that try to completely represent tangible artifacts, not simply as a new whizbang-computery way to offer a limp reference to something real.

Markup, I insist, was the necessary jolt to encourage machine-readable encoding. XML is a convenient vehicle for bridging relational tools and linked open data (LOD) – or any triple – but the weight limit of RDF/XML has been exceeded. The standards for annotation are necessary for interoperability and for the exposure and discovery of LOD, but they are also a very useful way to work with offline, local, or private, siloed data.
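To make the shift concrete, here is a minimal sketch of an annotation in the spirit of the Open Annotation model, serialized as JSON-LD. All of the identifiers (the example.org URIs, the folio image) are hypothetical, and the context reference stands in for whichever published OA context a real implementation would cite; this is an illustration of the shape of the data, not a normative document.

```python
import json

# A minimal annotation sketched after the Open Annotation data model.
# Every URI below is a hypothetical placeholder, not a real resource;
# a real annotation would point "@context" at the published OA context.
annotation = {
    "@context": "http://www.w3.org/ns/oa#",
    "@id": "http://example.org/annotations/1",
    "@type": "oa:Annotation",
    "hasBody": {
        # An inline textual body: a transcription or note.
        "@type": "cnt:ContentAsText",
        "chars": "A marginal note transcribed from folio 12r."
    },
    # The target is the thing being annotated, here an image of a folio.
    "hasTarget": "http://example.org/manuscripts/ms-1/folio-12r.jpg"
}

doc = json.dumps(annotation, indent=2)
print(doc)
```

Because the body and target are just linked resources, the same structure works whether the data is exposed on the open web or kept in a local, private store – which is exactly the offline/siloed use the standards also serve.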

I am able to share experience with OAC and its manuscript-focused children, SharedCanvas and IIIF. These standards were emerging as the transcription tool T-PEN was being completed – they allowed us to include features that were previously unplanned and filled me with healthy discontent at its completeness. Our current project, a tool for the complete creation of digital editions (focusing on manuscripts), makes heavy use of these standards and is dangerously near spawning a few of its own.

I would like to learn about other efforts in annotation, especially in fields outside of manuscripts. What already exists, what is in flux, and has this shift impacted the way you organize data?

At the very least, I would like to debate whether annotation is a fad or whether there is a real possibility that markup will get out of the way, leaving us with a single pristine artifact that takes the universe as its metadata.
