デイビッド

Hi! My name is David da Silva.

I made this website to show off my design skills and display case studies of some of my favourite projects / experiences. It's far from as complete as I'd like, but I don't have time to work on it right now.

Some of my passions are:

I have working experience as:

That doesn't say much about what I did at each place, I know. You can find out a bit more on LinkedIn, on my resume, or just ask me directly! :)

Datasets of my persona: Twitter, Instagram, Twitch, YouTube, e-mail

Gamedev / Product Design projects

These are some of my most treasured projects or experiences. Sadly, I haven't had the time to write a proper description for each of them, nor to add all of them here, but I can def chat about them if you ask me!

DementiaCare
Product Design & System Architecture
2nd Prize HackMed 2018

HackMed 2018 has been my favorite hackathon. I went alone and teamed up with the people I shared a table with, which ended up being exactly the kind of people I was looking for: lovely people from diverse backgrounds.

Description WIP.

#artpluscode: programmatic art experiments

Generated using JavaScript and HTML5 Canvas. I publish them first on my Instagram account, and sometimes I stream their creation on Twitch and upload the recordings to YouTube. All the code is released on GitHub.

The #artpluscode wall website is probably the best way to view them, but it doesn't have every single piece released on Instagram.
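For a feel of the workflow, here's a minimal sketch of the kind of Canvas piece these experiments start from. It's purely illustrative (a hypothetical example, not one of the published #artpluscode pieces):

```js
// Minimal generative-art sketch: a grid of circles whose radii are driven
// by a seeded pseudo-random generator, so a piece can be reproduced exactly.
// Illustrative only, not one of the published #artpluscode pieces.
const canvas = document.createElement('canvas');
canvas.width = canvas.height = 600;
document.body.appendChild(canvas);
const ctx = canvas.getContext('2d');

// mulberry32: a tiny seeded PRNG, so re-running with the same seed
// regenerates the same artwork.
function mulberry32(a) {
  return function () {
    let t = (a += 0x6D2B79F5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

const rand = mulberry32(17); // seed chosen arbitrarily
const cell = 40;
for (let y = cell / 2; y < canvas.height; y += cell) {
  for (let x = cell / 2; x < canvas.width; x += cell) {
    const r = 2 + rand() * (cell / 2 - 2); // random radius per grid cell
    ctx.beginPath();
    ctx.arc(x, y, r, 0, Math.PI * 2);
    ctx.stroke();
  }
}
```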

LifeSaber
Product Design
1st Prize PennApps 2015 Winter (~1300 participants)

Every year, around 600,000 people in the US have a cardiac arrest; heart disease alone is the country's leading cause of death, claiming 600,000 lives per year. The chances of surviving are about 4x higher if a first responder is on the scene, but only ~20% of people know what to do.

LifeSaber is an Android and Android Wear app that tackles this problem by turning anyone into a first responder and life saver, doing the following:

We used UPenn's MyHeartMap AED database and dumped it into MongoDB to find the nearest defibrillators to a person.
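As a rough sketch of how such a nearest-AED lookup can be done with MongoDB's geospatial queries (the database, collection, and field names here are made up for illustration, not the actual hackathon code):

```js
// Sketch of a nearest-defibrillator lookup using MongoDB geospatial queries.
// Names ('lifesaber', 'aeds', 'location') are hypothetical.
const { MongoClient } = require('mongodb');

async function nearestAEDs(lng, lat, maxMeters = 500) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const aeds = client.db('lifesaber').collection('aeds');

  // GeoJSON points need a 2dsphere index for $near queries.
  await aeds.createIndex({ location: '2dsphere' });

  const results = await aeds.find({
    location: {
      $near: {
        $geometry: { type: 'Point', coordinates: [lng, lat] },
        $maxDistance: maxMeters, // metres
      },
    },
  }).limit(5).toArray();

  await client.close();
  return results; // MongoDB returns them sorted nearest-first
}
```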

Making lifesaver vibrate at the correct bpm for CPR is incredibly genius. #PennApps

— Steph Hippo (@stephhippo) January 18, 2015

Lifesaber is a really solid health hack. http://t.co/uaJ4DHGNpY #pennapps

— Nick DiRienzo (@nickdirienzo) January 18, 2015

Wow. That was a tough decision. Congrats to Lifesaber, Curiosity, and Fruit Ninja - VR style. @PennApps

— Corey Farrell (@THEFOAGUY) January 18, 2015

SpatialOS Networking Design Guides
Research, Prototyping, Copywriting, & User Validation
Improbable

EscapeRoom
Interaction Design & Prototyping
START Hack 2016

Long story short, Logitech put a Tobii Eye Tracker at our disposal. I started thinking about new interactions in games that could use it, or existing ones that could be improved through its use.

I had recently been playing Tomb Raider on the Xbox One, and I remembered two aspects of the game that left me wishing for a better experience:

See, the problem here is capturing the player's intent. How do I tell whether the player is walking next to the bottle just because it's on their way, or because they want to interact with it? How do I confirm to the player that they can indeed interact with the item, without giving away that it's an interactable item when the player had no intention of interacting with it? This is especially hard in a 3rd person controller-based game – first-person games can at least get away with using the center of the screen as a pointer.

The relic puzzle has a similar problem: even if the engraved detail is visible on the screen, even in its center, it doesn't mean the player has noticed it. A simple addition like making the player press a button when the detail is right in the center of the screen would have avoided this problem.

Back to the hackathon, these two aspects were perfect test candidates for being improved with an Eye Tracker, which would be used as a pointing device to understand the player's intent: if the player is next to the bottle, and their eyes are pointing to the bottle, it is very likely that they want to interact with the bottle, so bring up that context menu. In the relic puzzle, if they fixate their eyes on the engraved text, that means that they have noticed it.
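A toy sketch of that intent check, written in plain JavaScript rather than the Unity project we actually worked on at the hackathon (all names and thresholds below are invented for illustration):

```js
// Toy gaze + proximity intent check. Names and thresholds are invented;
// the hackathon prototype was a Unity project, not this code.
const INTERACT_DISTANCE = 1.5; // how close the player must be (metres)
const GAZE_DWELL_MS = 300;     // how long the eyes must rest on the item

let gazeStart = null;

function updateInteraction(player, item, gazeTarget, now) {
  const close = distance(player.position, item.position) < INTERACT_DISTANCE;
  const lookingAtIt = gazeTarget === item;

  if (close && lookingAtIt) {
    if (gazeStart === null) gazeStart = now;
    // Only surface the prompt once the gaze has dwelled long enough,
    // so walking past the item doesn't spoil that it's interactable.
    if (now - gazeStart >= GAZE_DWELL_MS) item.showContextMenu();
  } else {
    gazeStart = null;
    item.hideContextMenu();
  }
}

function distance(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}
```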

Sounds great and exciting, right? Sadly, we couldn't get much working at the hackathon. It was a 24h one, we aimed too big instead of focusing and making incremental progress, we had hardware issues with our laptops and the Leap Motion (which I wanted to use to rotate the relic with my hand), I needed a lot of sleep, and we didn't organize ourselves too well (integrating like 4 parallel Unity projects, hello darkness my old friend).

Overcharge, GGJ18
Game Design, Thruster UI, and Networking Code

The theme was "Transmission". Inspiration:

"Multiplayer Online Game Development" Course
Everything
Sixth Edition

This June 25-29 I'll be holding the Sixth Edition of the course in Barcelona. You can sign up and find more info here. And if you know of anyone who could be interested, please let them know! I would personally have loved to attend a course like this one when I was younger – it would have propelled me so much.

Here's a short summary of the contents (as seen on the poster):

JavaScript & Canvas & Spritesheets & Node.js & Socket.io & Deployment & NPC & Accounts & Chat & Leaderboards & Persistence & Client-side vs. Server-side Authority & Client-side Prediction & Server Reconciliation & Dead Reckoning & Interpolation & Lockstep & Simulation Rollback & Uniform Grids.
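To give a flavour of one of those topics, here's a minimal entity-interpolation sketch in JavaScript (a toy version for illustration, not the actual course material):

```js
// Toy entity interpolation: render remote entities ~100 ms in the past,
// blending between the two server snapshots that bracket the render time.
// A simplified illustration of one course topic, not the course code.
const INTERP_DELAY_MS = 100;
const snapshots = []; // [{ t, x, y }], pushed as they arrive from the server

function onServerSnapshot(snap) {
  snapshots.push(snap);
  if (snapshots.length > 60) snapshots.shift(); // keep a short history
}

function interpolatedPosition(now) {
  const renderTime = now - INTERP_DELAY_MS;

  // Find the pair of snapshots surrounding renderTime and blend them.
  for (let i = snapshots.length - 1; i > 0; i--) {
    const a = snapshots[i - 1], b = snapshots[i];
    if (a.t <= renderTime && renderTime <= b.t) {
      const f = (renderTime - a.t) / (b.t - a.t);
      return { x: a.x + (b.x - a.x) * f, y: a.y + (b.y - a.y) * f };
    }
  }
  // No bracketing pair yet: fall back to the latest known position.
  const last = snapshots[snapshots.length - 1];
  return last ? { x: last.x, y: last.y } : { x: 0, y: 0 };
}
```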

Physics-based Sonic Riders
Everything but the 3D assets and music/sfx

My first solo Unity project, a homage to one of my favorite games. The controls are very hard, similar to drone racing (yaw, pitch & roll).

I worked on this in preparation for Improbable's interview – I had barely touched Unity before, and an assignment required modifications to a Unity project. During the internship, I quickly added multiplayer support using Improbable's SpatialOS, to see what it was like to integrate the SDK into an existing game.

Fun bug I had: I made the jump force depend on `deltaTime`, so I sometimes had radically different jump heights (playing on a laptop contributed). Through this I learnt that one-off forces should not depend on time.
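In pseudo-physics terms, the mistake looked roughly like this (a simplified JavaScript sketch with made-up constants, not the actual Unity code):

```js
// Simplified sketch of the bug: scaling a one-off impulse by deltaTime
// makes the jump height depend on the frame rate. Constants are invented.
const GRAVITY = -9.81;
const JUMP_FORCE = 250; // what the buggy version used
const JUMP_IMPULSE = 5; // what the fixed version uses

function onJumpPressedBuggy(body, deltaTime) {
  // Buggy: a single-frame impulse scaled by deltaTime. At 144 fps the
  // impulse is tiny; on a struggling laptop at 20 fps it's huge.
  body.velocityY += JUMP_FORCE * deltaTime;
}

function onJumpPressedFixed(body) {
  // Correct: a one-off impulse is applied exactly once, so it must not
  // be scaled by time. Only continuous forces should be.
  body.velocityY += JUMP_IMPULSE;
}

function integrate(body, deltaTime) {
  body.velocityY += GRAVITY * deltaTime; // continuous force: scale by time
  body.y += body.velocityY * deltaTime;
}
```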

Wrote this AI in < 1h using ML (slightly custom kNN). It is trained by observing a real player drive, and trying to replicate what the player would do in the same situation (stores memories as [pos, vel, inputs], during execution finds closest memory to situation, and mimics). pic.twitter.com/o1qIM8Q7yz

— David da Silva (@dasilvacontin) June 5, 2018
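The idea fits in a few lines; here's a toy JavaScript version of that memory-based mimic (a sketch of the approach described in the tweet, not the original code):

```js
// Toy memory-based mimic AI: record [position, velocity, inputs] while
// watching a real player, then replay the inputs of the closest memory.
// A sketch of the approach, not the original implementation.
const memories = []; // { pos: [x, y, z], vel: [x, y, z], inputs: {...} }

function observe(pos, vel, inputs) {
  memories.push({ pos, vel, inputs });
}

function squaredDistance(a, b) {
  let d = 0;
  for (let i = 0; i < a.length; i++) d += (a[i] - b[i]) ** 2;
  return d;
}

// k = 1 nearest neighbour over the concatenated [pos, vel] state.
function decide(pos, vel) {
  const state = pos.concat(vel);
  let best = null, bestDist = Infinity;
  for (const m of memories) {
    const dist = squaredDistance(state, m.pos.concat(m.vel));
    if (dist < bestDist) { bestDist = dist; best = m; }
  }
  return best ? best.inputs : {}; // mimic what the player did here
}
```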

Reads 2018

Active (progress)

Inactive (progress)

Completed

Backlog

Exhibitions & Events 2018

Attended

Movies 2018

Seen

To-See

lol

Think there's a book / event / movie that I would really enjoy? Let me know!


#artpluscode #17. View more on my Instagram.