Decentralized Data Science and the Ghostbuster of Starbucks

A philosophy and stories of shifting perspective in the Era of Data

My notes and the post-talk interview with the speaker, Luciano, on "Why BlockChain is the Future of Data Science" deserved an article.

“Data science” is an overused and ambiguous term.

We’re collecting more information about more of everything than ever before.

Luciano then asked, “What happens when data isn’t centralized?”

The question hung heavy in the nerdy air. We all needed a story to imagine what he meant. So he told us about his encounter with a Ghostbuster at a random Starbucks.

One afternoon, Luciano sat drinking coffee and being all the things he is, in a Starbucks. A Ghostbuster entered the shop. He wore a proton pack, carried a laser wand connected to the pack, and began scanning the space. He looked exactly like Egon.

When Luciano asked, the Ghostbuster explained — he was collecting internal maps of every Starbucks in the world. They would all be mapped for accurate upgrades and adjustments. Want to plan a network-wide upgrade of every espresso machine around the world? There’s an accurate count and location of every eligible machine to plan and execute replacement.

The Ghostbuster was "quantifying the world in real time." He was building the initial, universal blueprint for a Starbucks, updatable as the individual layouts changed. It wasn't that the process was completely decentralized (the scans fed into a single database), but it showed the movement toward a new way of modeling the world, of creating and using data.

"Only the government can afford to collect data" was the old quip. Government satellites, surveys, and tracking have given us immense stores and feeds of "free" information. The private sector is now exploding, though, taking a decentralized approach to surpass that collection in both volume and quality.

Supply chain modeling today only collects data at the time a commodity is delivered. We need a model that picks up data continuously, integrates it, and uses simulations with sampling to predict.
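The idea of predicting with sampling rather than waiting for delivery-time snapshots can be sketched with a tiny Monte Carlo simulation. Everything here is hypothetical (the opening stock, demand distribution, and function name are made up for illustration):

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def stockout_risk(days, daily_demand_mean, opening_stock=100, runs=1000):
    """Estimate the probability of running out of stock by sampling
    daily demand, instead of only checking stock at delivery time."""
    stockouts = 0
    for _ in range(runs):
        stock = opening_stock
        for _ in range(days):
            # hypothetical demand model: Gaussian around the daily mean
            stock -= max(0.0, random.gauss(daily_demand_mean, 3))
        if stock < 0:
            stockouts += 1
    return stockouts / runs

risk = stockout_risk(days=7, daily_demand_mean=12)
```

The point of the sketch is the shift in posture: instead of one measurement per delivery, the model continuously integrates sampled scenarios and outputs a forward-looking risk estimate.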

To Luciano, Blockchain is the "Collective Memory." It gets better with more things quantified and added to the system/chain. With current computing, adding big data sets slows the chain down, but this is a limitation actively being worked on.
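The "collective memory" framing boils down to an append-only record where each entry commits to everything before it. A toy sketch (not any production chain; the store IDs and class name are invented for illustration):

```python
import hashlib
import json

class Chain:
    """Toy append-only hash chain: each record commits to the previous
    one, so the history acts as a shared, tamper-evident memory."""

    def __init__(self):
        self.blocks = [{"prev": "0" * 64, "data": "genesis"}]

    def _hash(self, block):
        # canonical serialization so the hash is stable
        return hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()

    def add(self, data):
        self.blocks.append({"prev": self._hash(self.blocks[-1]), "data": data})

    def verify(self):
        # every block's "prev" must match the hash of the block before it
        return all(
            self.blocks[i]["prev"] == self._hash(self.blocks[i - 1])
            for i in range(1, len(self.blocks))
        )

chain = Chain()
chain.add({"store": "starbucks-001", "scan": "floorplan-v1"})
chain.add({"store": "starbucks-002", "scan": "floorplan-v1"})
assert chain.verify()          # untouched history checks out

chain.blocks[1]["data"] = {"store": "tampered"}
assert not chain.verify()      # any rewrite breaks the chain downstream
```

This also makes the scaling complaint concrete: every scan or data set added is hashed into the permanent history, so large payloads grow the chain that every participant must store and verify.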

I share the optimism and feel a sense of relief and validation every time a chain is used for something novel and useful to society, i.e. for more than settling a few transactions while burning insane amounts of energy to solve puzzles. Current blockchain tech is still oversold (and unproven), despite all the hype. The biggest challenge is getting people to give up their data to prove that this decentralized data science can work.

Luciano referenced channeling your inner Kanye, i.e. not giving a f***. It made me think about how authenticity and honesty are the long game of real trust. If people were thicker skinned and less insecure to begin with, we wouldn't need all this encryption and consensus in the first place.

He gave an interesting example from a community bonded by a health condition.

