This repository has been archived by the owner on Dec 29, 2021. It is now read-only.
(NOTE This showcase is part 3 of Positioning, vision and future direction of the Dat Project)

I wanted to address some concerns @blahah and @joehand had with regard to traction and exposure of the Dat project.
And I'd like to do this by presenting a project I know that had a similar evolution to yours: Vert.x.
@blahah The Vert.x project, created by Tim Fox, came along quite smoothly, with a steady 1.x, 2.x release chain. Tim as custodian had a very practical approach, hammering on KISS, lean principles and steady, healthy growth. Very comparable to what you want to see with Dat. Their project organization was very similar, too: things were added as needed, by people with the time and opportunity. Everything fine!
But then - despite their careful approach - traction happened. People started taking note. They couldn't help it. And quickly problems started to mount. Code that shouldn't be there crept into vertx-core, the architecture needed an overhaul, and, more importantly, Tim Fox - being the ultimate expert - was drowning in community support, which didn't help (I'd imagine e.g. @mafintosh already has quite a workload on these things). [Vert.x was going up quickly on Kay's 'Yikes' curve, @jondashkyle]
So with community involvement they decided on a clear strategy for a total reorganization, to keep their dedicated focus, productivity and high code quality, and to guarantee useful community involvement. In short, they did their job well, and even survived Tim Fox leaving the project without so much as a hitch. The result is there for all to see.
In terms of broader exposure to the public Vert.x is still quite low-profile, being silently integrated and embedded in a number of large projects, for dedicated purposes mostly. A kind of exposure the Dat team would feel comfortable with as well, I think.
Concluding: "Be prepared for traction when it comes"
Traction and exposure
Now @joehand, besides the 'How much traction can we handle? How much should we handle?' discussion, you rightly say: 'Traction leads to exposure, and exposure leads to unwanted restrictions'. An interesting subject.
And vitally important, I would argue.
So are ticklish questions like these:
Can we afford to be relatively small and under the radar for long?
How much exposure is required to survive?
What is our moral responsibility towards the technology in that regard?
What if our technology does not become the success we imagined?
Let's address them in turn.
Negative exposure

@joehand wrote:

"Finally, there are use cases we're interested in where popularity is a negative. Human rights defenders in countries with heavy surveillance need tools built on Dat or similar tech. But popularity will make it more likely Dat is blocked or measures put in place to make it more difficult to use (see bittorrent)."
I agree. But, unless you guys really mess up, negative exposure eventually coming to your project is inevitable. It's a given. And just like asteroids, tsunamis and earthquakes, you should plan in advance and make preparations so you are ready when the moment comes.
Having stable, theoretically unstoppable software at that time just doesn't cut it, as I'll explain next.
Concluding: "Be prepared for restrictions, they will come"
Minimum exposure
Consider that while you go steady and slow, technology in general, and the encroachment on internet freedoms in particular, may well be evolving at an altogether more rapid pace.
And with this comes the same increase in the ability of government and commercial powers to control and suppress ever smaller initiatives with relative ease.
And when dark powers strike, you'll be much weaker as a small community of color-bearded ;) good-hearted devs than as a broad, vibrant community with representatives from throughout society who are willing to defend you tooth and nail.
Another point. The above doesn't matter, right? "Our decentralization magic, once stable, is like self-replicating nanotech. Virtually unstoppable once we open the vial." You may think something along these lines.
But you'd be dead wrong! Unlike the nanotech, people have to switch you on first, on a peer-by-peer basis. And if they don't like your technology, or there is a more viable candidate around, they will choose that technology over yours. I know I would, at least.
Sure, there will still be a number of scientists using your tech in 5, 10 or even 30 years, just like the many old technologies - sometimes in archaic languages from the sixties - still floating around now.
And you will apply all your hard-won knowledge and experience to that other technology that became the standard (maybe IPFS), so not much is lost - if it's in the best interest of the scientific community.
But it would be a damned pity, would it not? You want to sow seeds that prosper even when you move on to greener pastures.
Concluding: "Exposure is vital to survival"
Optimum exposure
We've established there is a minimum required amount of exposure. What about the upper bound? Is there a maximum? Or, better stated: what is the best amount of exposure at any specific point in time?
Well, I would argue that it should be the maximum amount you can possibly deliver successfully. Strive for maximum exposure! But raw quantity alone misses the point: yes, strive for the maximum, but reach only the right people. So follows:
Concluding: "Optimum exposure requires targeting"
And the effort required to get the right kind of exposure and build the right community should place little to no unnecessary burden or distraction on the core team. Therefore:
You could say good exposure requires marketing, but I know that is a dirty word :)
Responsibility
I only recently got interested in the field of Decentralized Computing (after being annoyed at how the term 'Serverless' was being hijacked by the big server moguls). And I observed the following:
- An entire IT field that is mostly still unbroken ground
- A field where there are as yet hardly any players
- A field where there is zero (as in zilch) generic standardization effort
- A field where there are about 4 to 5 viable technology candidates for standardization, but only for sub-domains of the field, like file sharing
- And only one of these candidates has potential for more generic application. That would be Dat!
Ouch! That puts the Dat project in a unique position!
In any other IT field, being a first mover in such an empty field would lead someone to sink to their knees and thank the Big Bang, or whatever god they worship.
The special characteristics of this IT area make this position both a gift and a curse.
Given this situation, can you afford to let the technology fail? Can you afford not going all out in healthy competition with IPFS and others? Can you afford not jumping to grab unique selling points? Or can you afford not to broaden your horizon and reposition yourself so that IPFS and Dat can coexist happily together?
The answer is ... yours to make
Concluding: "New technology comes with moral responsibility"
Success
I am not going to spend much time here. Success touches on all of the above, and I am not going to talk about 'Failure'. The essence of my arguments should be clear by now:
The moral of the story: "Consciously craft the formula to your own success"
"... ScienceFair is currently not suffering from a problem of having too few people using it. Actually we've been a victim of our own success, in that I've essentially had to go dark and completely restructure my life in order to keep up with the current interest, and to plan for the future. ..."
This is exactly why I strongly advise strategizing for traction.
The moral: if you do your job well, traction will come - so plan for its eventual arrival.
Next part: Vert.x as case study for good project redesign