
Joost.com, aka The Venice Project

Back in 2006, after visiting FOSDEM and seeing Alexander Fritze demo Zap (a voice-over-IP phone using Mozilla technology, developed by 8x8), I tried to make it work on Mac OS X, which was my OS of choice at the time (it was way less used by developers than it is today). I was affected by personal issues, so I didn't really do anything about it until the end of June of that year. Finally, on a Friday night, I managed to add the proper ld flag to get audio input working and posted something on the Zap mailing list.

I got an answer from the project manager at 8x8 giving me FTP write access so I could upload my build. I don't remember if I sent a patch too.

Furthermore, I got another interesting email from Alex, telling me he had forgotten to tell me something at FOSDEM, but he could not really say more. To which I replied: give me an NDA and I'll sign it. Two hours later, I had an NDA to sign, a very vague one. The next day, a Sunday, I was having a very noisy conversation over Skype with my CTO-to-be. I had signed but not yet sent said NDA. The conversation led me to fly to the Netherlands the week after to do a bunch of interviews - and to remind my current boss that he could not prevent me from going, because I had already handed in my resignation letter.

At Schiphol, I met Alex and his wife on the train to Leiden, where the project was based. We talked a bit, and they wanted me to work on the Mac OS X port.

I was a dev by night at the time, mostly on Camino. My day job revolved around closed-source monitoring software. Porting to Mac OS X felt way out of my league, and I was scared by the task. I felt far more competent at testing, as I was testing a lot of software. So I was bold: I talked about QA, and about not being that confident writing code myself.

The Venice Project (TVP) was a video player over the internet. Users could watch videos; YouTube was one year old. The only alternatives for watching videos were RealPlayer or QuickTime, but there was no website offering a catalog of videos. TVP offered a player, initially for Windows. It was an embedded Firefox using FFmpeg for video playback and the library used by both Skype and Kazaa for P2P delivery. Later on we released a Mac OS X client, and we never went the route of publishing a Linux client, even though we used one internally.

I ended up in the client room with Alex and Alan, so I could ask questions easily. The P2P team was sitting in another room on the other side of the hallway. I moved to my own room when Jonathan came back from vacation. For the first three or four months we were not streaming, so the work being done was around the UI and playing videos properly. The reason was that another floor of 101 Schipholweg was working on transcoding videos to the proper format, which meant deciding what software to use and what settings worked best with the projected bandwidth. The P2P team was working on adapting the code to our needs.

So I started watching a video about sharks (it was embedded into the early versions of the player) over and over and over again, and learning the job of a QA engineer. I had books to read on methodology, I started building a lab with different hardware to catch potential driver-related bugs, and I also had my private ADSL line so I could mimic normal users. I was setting up various Windows machines, and at least twice I forgot to install the sound drivers, which made the client not work at all.

I didn't follow backend work that closely, as there was already a lot of work to do on the client side. But the backend needed to implement at least the following:

  • transcoding (as explained above)
  • initial seed of the video (i.e. loading it onto our server infrastructure, which was also being built)
  • search
  • metadata (what is xxxxx.mpg about)
  • reporting
  • ads

The process looked like this: rights acquisition (making a deal with the content owner) -> video sent to us (by various means, mostly HDDs) -> transcoding + backend feeding (getting the metadata into our system so people could find and watch the video) -> seeding in our DC (I'd guess using scp).
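The automated part of that flow can be sketched as a small pipeline. This is a hypothetical sketch: the `Video` record, stage names, and catalog dict are all my invention; the real backend was far more involved.

```python
from dataclasses import dataclass


@dataclass
class Video:
    # Hypothetical record; the real metadata was much richer.
    source_file: str
    title: str = ""
    transcoded: bool = False
    seeded: bool = False


def transcode(video: Video) -> Video:
    # Stand-in for the real transcoding step (codec and bitrate were
    # chosen against the projected user bandwidth).
    video.transcoded = True
    return video


def feed_backend(video: Video, catalog: dict) -> Video:
    # Register the metadata so users can find and watch the video.
    catalog[video.title] = video
    return video


def seed(video: Video) -> Video:
    # Push the initial copy onto the datacenter infrastructure
    # (probably something as simple as scp at the time).
    video.seeded = True
    return video


def ingest(source_file: str, title: str, catalog: dict) -> Video:
    # Rights acquisition and physical delivery (mostly HDDs) happen
    # before this point; the automated pipeline starts here.
    video = Video(source_file=source_file, title=title)
    return seed(feed_backend(transcode(video), catalog))
```

The point of the sketch is the ordering: a video is useless to users until all three stages have run, which is why the backend teams were on the critical path before streaming could start.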

Content acquisition was done by a 'sales team' of sorts, and they had to work against the image of Kazaa and of our founders. So content at the beginning, early 2007, was sparse, very sparse, and content providers were afraid that their content would end up being pirated through our infrastructure. Content providers were using the same rules as TV, so content was very, very much geo-locked ('available for continental Europe excluding Portugal' was the one that struck me most, because on one update most of the content was like that). This caused issues because the geo restrictions had to be implemented both in the backend and in the client. We finally managed, but it took us a long while to get it working properly.
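The shape of such a rule is simple enough to sketch. Everything below is hypothetical (names, country list, representation); only the "continental Europe excluding Portugal" example comes from what we actually shipped.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class GeoRule:
    # Hypothetical representation of a content deal's territory:
    # a set of allowed ISO 3166-1 alpha-2 country codes, minus
    # explicit exclusions negotiated per deal.
    allowed: frozenset
    excluded: frozenset = frozenset()

    def permits(self, country: str) -> bool:
        return country in self.allowed and country not in self.excluded


# Illustrative, incomplete country list.
CONTINENTAL_EUROPE = frozenset({"FR", "DE", "NL", "BE", "ES", "IT", "PT", "AT", "CH"})

# "Continental Europe excluding Portugal" - the rule that struck me most.
rule = GeoRule(allowed=CONTINENTAL_EUROPE, excluded=frozenset({"PT"}))
```

The hard part wasn't the check itself: the same rule had to be enforced in the backend (refuse to serve the stream) and mirrored in the client (hide the entry from the catalog), and keeping the two in sync is what took us a long while.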

All of this was built from scratch, and it had to work with both the P2P team and the client team. On the client side, the UI was focused on two things, at least in the beginning: content discovery and social interaction (remember, there was no Twitter/X nor Facebook yet). There was a chat option using XMPP in the background.

Also, we needed to set up our global infrastructure. For non-technical reasons, the first location for a datacenter was Luxembourg, which led to a rather scarce choice of DC providers. We had to choose the OS we would run in our DC; yes, we also had to build our infrastructure from scratch (AWS was only just getting started back then). Most of our hardware was Sun x86-based, and the choice ended up being Debian, with Puppet to manage our infra, over Solaris. (We still had some Solaris internally for development :))

Testing was internal, and we started streaming and trying different strategies for content discovery. As the content library wasn't that large, we tried a bunch of methods. At the same time we started to add some social salt to our product: people could tag, comment on, and rate videos. That meant we had to enrich our metadata, which we would also be able to use for content discovery later. We could also chat live (over XMPP) while watching a show. I remember being pulled in to demo that capability while the sales person (for ads, this time) was demoing our platform to Coca-Cola.
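How user-generated tags and ratings feed back into discovery can be sketched roughly like this. The class, its storage, and the ranking-by-average-rating are all my assumptions, not the real system:

```python
from collections import defaultdict


class Catalog:
    # Hypothetical sketch: user tags and ratings enrich the base
    # metadata, and discovery reuses them for ranking.
    def __init__(self):
        self.tags = defaultdict(set)      # video_id -> set of tags
        self.ratings = defaultdict(list)  # video_id -> list of 1-5 ratings

    def tag(self, video_id: str, tag: str) -> None:
        self.tags[video_id].add(tag.lower())

    def rate(self, video_id: str, stars: int) -> None:
        self.ratings[video_id].append(stars)

    def discover(self, tag: str) -> list:
        # Rank the videos carrying this tag by their average rating.
        matches = [v for v, t in self.tags.items() if tag.lower() in t]

        def avg(video_id: str) -> float:
            r = self.ratings[video_id]
            return sum(r) / len(r) if r else 0.0

        return sorted(matches, key=avg, reverse=True)
```

Even a crude scheme like this shows why the metadata enrichment mattered: with a small catalog, editorial curation works, but user signals are what let discovery scale later.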

At some point when distributing P2P software, you need to test the network in real conditions, which means getting users to use your product. So we opened a private beta - the hype was high and recruiting users was easy. We found some bugs in the network stack, some UI ones, and got feedback on the metadata and searchability. Our APIs, although not documented, were used by some to provide better content discovery than we were offering ourselves. We added some obfuscation to protect the P2P stack, which caused issues with AV software and also meant less testing, as the obfuscation was only applied to external public builds.

May 1st, 2007 was picked as our soft launch date. This coincided with the beginning of ApacheCon Europe, and most of the engineering team, being based in Leiden, decided to also attend said conference. The backend had been neither stress tested nor load tested. The gates opened around 10 AM, I believe, and our backend died before 10:10. Engineers were in a never-ending firefight to keep the backend up, and for the users, whose anticipation was great (remember, YouTube was not yet the giant it is today), the experience felt miserable. We worked as a team over the poor ApacheCon Wi-Fi, trying to fix the issues, but testing was hard as bandwidth was scarce. We managed to stabilize the situation and had something that wasn't going down every twenty minutes. But for users, it didn't look good.

We worked our asses off to have a second soft launch mid-August, if memory serves me correctly. We reworked how metadata and content were managed. The work paid off and load testing passed: we were ready for the public. Unfortunately, the hype was gone. We did get users, but they didn't stay, because we still had issues with content and its discovery.

We got rid of most of our technical issues. This is when the company pivoted to be more sales- and marketing-centric:

  • we sent plenty of emails to regain users (some of us called that spamming users)
  • we forced user registration to access any kind of content, as ads would sell better
  • we started accepting very shitty content just to have more of it.

None of this solved the issue: the hype was gone, and YouTube was the place to go for videos.

We then pivoted to a browser plugin to boost adoption - but that didn't work out.

We moved to flash - without P2P and that didn't help either.

We had an iPhone app - that was a fun one to QA, but it didn't boost our viewership. Nor did the live stream we did for the NBA.

As we had stopped using Mozilla, I changed jobs.
