seltzered_ 3 months ago

Dave Lee (aka Dave2D) posted a YouTube review a month ago touching on this feature of using an iPad Pro (2024) with Final Cut Pro to record 4 streams of video and replace a Stream Deck + Mac + OBS setup: https://m.youtube.com/watch?v=bG2N4a0ir3A&t=626

(Just referencing, not really my area of interest)

nerdjon 3 months ago

I have done multiple recordings with this and such a small change has made a massive difference in my workflow.

Even if I am just recording a single angle, just being able to have a portable view of my camera has eliminated something being messed up with the recording like lighting, angle, etc.

Previously I would need to make a recording, send it to my Mac, check it, and hope that I don't mess something up when I record again.

That being said, it's not perfect. I am annoyed that for some reason I can't do 60fps if I am doing 4k, even though natively the Final Cut Camera app can do this.

Also annoying: from the iPad I can't add an audio-only channel from a Bluetooth source; instead I have to keep my iPad camera turned on and connected to my microphone. Not the end of the world, but irritating since Final Cut Pro multicam itself does support audio-only tracks.

It would be awesome if in a future update I could preview certain effects in real time, like green screen keying. But that would be a bonus and not a complaint.

I am sure that for more established studios this is not really useful and they already had similar functionality with far more expensive setups, but for someone doing everything just by themselves this is an amazing feature that fixes my single biggest issue with recording videos.

samspenc 3 months ago

Somewhat related to this topic, Move.AI https://www.move.ai/ does something very similar but specifically for 3D 'markerless' motion-capture (MoCap) data. You can use iPhones, other smartphones, or (more recently) any inexpensive network-connectable camera to get synchronized video from multiple sources, and then use that synced footage and their service to get accurate MoCap animation without needing expensive MoCap suits or hardware.

Plask.ai and Rokoko Vision do something similar, but only support 1 camera afaik, so their camera-based MoCap is less accurate than Move.ai's multi-camera solution.

LiquidPolymer 3 months ago

As someone who uses Apple’s desktop Final Cut Pro app almost daily, I’ve been wondering if they would ever switch to a monthly fee as this pro version phone app does.

I think I paid around $350 eight years ago and get regular updates for free. I also have Adobe’s Premiere on my machine but I gravitate toward FCPX and wonder why I still pay Adobe monthly for that app. I might add syncing multiple cameras in FCPX is pretty easy in post production.

I’m wondering if this is a precursor of things to come: Apple charging monthly for desktop apps.

  • dchuk 3 months ago

    Realistically though, your $350 one-time buy was the equivalent of 5.8 years of this version of Final Cut Pro at $4.99/month. So while your 8 years of use beats the subscription on total cost of ownership, $4.99 still seems like a very reasonable monthly price.

    Plus, at a low monthly price it’s more likely people will try the app out and start creating videos with it, versus having to stomach and justify a $350 up-front spend.

    This all being said: I’ve never understood the issue with subscription software assuming the developers keep iterating on it. Seems like that is actually BETTER alignment of usage and commercial model than one time buy, because one time buys technically don’t incentivize the developers to keep iterating on software…

    • afavour 3 months ago

      > I’ve never understood the issue with subscription software assuming the developers keep iterating on it

      That also assumes the iteration provides value to me as a customer.

      I bought a license for Sketch years ago. It expired. Thankfully Sketch provide old versions for download so I’m still on version whatever-it-is… and it does everything I need. I know there’s new functionality but I’m not really interested in it.

      • jwells89 3 months ago

        I believe that this applies to Photoshop for a lot of people and is what makes the Adobe CC subscription such bad value if PS is all you want/need.

        CS2 is perfectly adequate for my needs, and it wouldn't be that bad to have to get by with CS1 or even 7.0. Most of the features added since then just make it slower and heavier, with little benefit to me as a user. That was bad enough with the paid releases, but with the subscription you have no choice but to fund the further bloat, which makes it that much worse.

        • camillomiller 3 months ago

          To be fair, the vast majority of people who think they need Photoshop for what they do actually don’t, especially considering how good the many far cheaper alternatives have become. For professionals, Creative Cloud has many pain points, but I would personally argue that the price/value we get is not as bad as many like to claim. Also, for professionals the suite is a business cost that’s easy to deduct.

      • sbarre 3 months ago

        I think this shows that the best option is to let users choose.

        Use it all the time and want all the latest updates? Buy it as a subscription.

        Casual user who doesn't need all the latest updates or support? One-time purchase.

    • egypturnash 3 months ago

      You are ignoring the misalignment of user and developer needs that happens when the devs spend all their time adding new features that mean absolutely nothing to your process; in the pre-subscription days I would skip new versions of Illustrator that only added stuff for people who do text layout or whatever. Now I'm sitting here paying for them to spend an entire dev cycle fucking around with text-to-image generation garbage I have no use for, like it or not.

  • voltaireodactyl 3 months ago

    Just to put this out in the universe: I hope it stays 1-time.

  • quitit 3 months ago

    It's something I've wondered about as well. But I can see a lot of reasons why a loss leader is a good strategy here:

    - It's a Trojan horse onto the platform, and that'll always be worth more than a subscription.

    - New features utilise new hardware leading to upgrades, and FCP has been a useful way for Apple to show off the mac platform's speed.

    - Standardising to FCP can lead to the purchase of some of Apple's more lucrative hardware such as Afterburner cards and MacPros, or even just halo purchases as teams grow. (I consider the iPad subscription one of these halo purchases.)

    - So why not just make FCP free? They have iMovie for that - and I think a free pro app would put Apple in hot water with competition regulators.

  • basisword 3 months ago

    It keeps you buying high end Macs though, which have gotten ridiculously expensive now. That alone seems like a good reason for Apple to keep investing in Final Cut and Logic.

    • dylan604 3 months ago

      I know multiple people still clinging to their old 2012 Mac Pros running FCP7 on OS X 10.6 as HD tape capture systems. And ONLY for that purpose. If it ain't broke, don't fix it, especially when fixing it means ripping out all of your existing capture devices and switching to USB-C/TB4-type connectivity.

      • adamomada 3 months ago

        Yah I made a mental note when I saw that setup being used in the Tiger King Netflix show (where they show quite a bit of behind the scenes of making it). I think it stuck in my head just because it was FCP7 but it’s like a high quality power tool - it doesn’t just stop working (unless you break it hehe)

    • 42lux 3 months ago

      I honestly believe the bang for your buck is the best it has ever been for Macs. The entry-level hardware is also reasonably priced compared to similarly specced Linux/Windows machines. The high end is pricey, but so are HP/Lenovo workstations.

  • whstl 3 months ago

    I also wonder the same. I use Logic almost daily, but would immediately change to something else if it ever becomes subscription-based. Maybe Bitwig and Linux.

    • Goofy_Coyote 3 months ago

      I was a Premiere user, switched to Davinci Resolve a few years ago, and never looked back. I still wonder why such a great piece of software is free. There's a learning curve though - nothing serious, but I had to put in a couple of hours to learn how to do things I used to do in Adobe Premiere.

      https://www.blackmagicdesign.com/ca/products/davinciresolve

      There's a paid version too, but I doubt anyone who's not making REAL money from editing ever needs it.

      • paulryanrogers 3 months ago

        Retired a paid Vegas install for Resolve. It is quite feature packed. And the learning curve wasn't bad. I also like that it's backed by a hardware company so they aren't incentivized to gate a lot behind the paid version.

      • linsomniac 3 months ago

        >I still wonder why such a great piece of software is free.

        My impression has been that it's "a rising tide buoys all ships" sort of perspective: having more people creating drives more Black Magic hardware purchases and the upgraded software (which is quite reasonably priced).

        I will go months between making videos, so a monthly subscription is painful when I'm not using it.

    • talldayo 3 months ago

      Bitwig is so good, nowadays. I paid full-price for a Studio license, and their Linux support is so top-notch that I don't feel bad putting money in their pocket. I used to use Logic and Ableton, but I don't miss anything from them when I use Bitwig.

  • foxandmouse 3 months ago

    I think the competition in this segment makes that less likely. DaVinci Resolve is good enough to keep them competitive; the same can't be said for Photoshop or Illustrator alternatives.

  • Xeyz0r 3 months ago

    Especially considering that over the past decade software monetization has been shifting towards subscription models

__mharrison__ 3 months ago

Davinci resolve will quickly align multi camera shots based on audio (or other metadata) and makes it really easy to edit this type of video.

  • nerdjon 3 months ago

    Worth mentioning that this feature has existed in Final Cut Pro for some time as multicam editing: easily syncing up multiple video and audio sources after the fact.

    The new feature here is being able to record as multicam and preview it on the iPad.

  • camillomiller 3 months ago

    Having more options and competition is good though, no? I would say the real deal here is the on-device preview more than the editing phase.

    • diffeomorphism 3 months ago

      It would be, but this is the opposite, because the competition is not on equal ground. Preinstalls are hugely important, defaults are important, and even if your software is better than the manufacturer's, it won't be preinstalled, you'll have less access to system functions, and it might break on OS upgrades.

      See any sherlocked app (e.g. Sidecar, Continuity Camera, ...) and the antitrust lawsuits.

    • __mharrison__ 3 months ago

      I guess that is a cool feature. I generally can't see my multicam live previews because I'm doing both the recording and being in the scene.

      A bonus of davinci resolve is that it runs on Mac, Windows, and Linux.

crb 3 months ago

The live remote camera setup seems like the bones of the only killer app idea I've ever had: a two-iPhone camera system for couples who want a stranger to take a picture of them while they're on holiday.

It would work like this:

- hand a volunteer a phone and ask them to point it at you

- direct the shot by looking at the second phone ("no, stand further back, not our legs, don't point it directly at the sun")

- put the second phone in your pocket

- have the stranger take the photo you actually want, saving you rounds of back and forth when they say "Is this OK?"

Has anyone ever seen an app like this?

  • brrrrrm 3 months ago

    Apple watch has this feature

    • geerlingguy 3 months ago

      Yeah I often use my Apple Watch camera app to get a live view of what my iPhone sees when I have it on a tripod or do a family pic. It even allows for remote shutter and start stop.

      • ziofill 3 months ago

        also useful to find your phone without making it ring loudly

  • IncreasePosts 3 months ago

    You'd probably be better served with a travel tripod and a remote shutter (e.g. recent Pixel phones can recognize you holding your palm up as a signal to take the picture). I imagine a random stranger helping with a pic might get bothered after about two directions.

  • Ukudala 3 months ago

    So dope. I have this problem all the time where I'm just like, UGH, can't you take a friggen' photo?? One without my LEGS!? Honestly, I started carrying a tripod (Manfrotto carbon fiber) just so I don't have to say thanks and later regret saying it because of their photos. This would be a killer app; I would get the best shots by using them to get the best shots of me.

  • kylehotchkiss 3 months ago

    Interesting... I carry my actual camera around with me, and I have a good understanding of its focal range and composition, so I can ask the person taking my picture to step back, point up, etc. That's more for people I know, though. But not letting somebody pinch to zoom is really the most important part of not ruining a photo!

RIMR 3 months ago

Any camera with an accurate clock can record video timestamped so that the exact time of each frame can be known.

Why do we need proprietary stuff like this when we already have a tried and true method for syncing multicamera shots?

The iPhone and iPad should be perfectly capable of this. Every video you take should be compatible, and pooling videos from multiple people should allow you to sync and sequence the videos properly.
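
For what it's worth, containers already carry part of this: you can read a clip's recorded start time with ffprobe, for example (illustrative command; the standard creation_time tag is usually only whole-second resolution, so frame-accurate alignment still needs something finer, like timecode or audio sync):

    # print the creation_time tag from the container metadata
    ffprobe -v error -show_entries format_tags=creation_time -of default=noprint_wrappers=1 clip.mov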

  • dylan604 3 months ago

    > when we already have a tried and true method for syncing multicamera shots?

    please inform me of these tried and true methods that allow you to take a mobile device that is not a dedicated camera and make it part of a multicam shoot. this is one of those things that shows how wedded to the status quo you might be. this kind of thing opens up multicam shoots to sooooo many more people than even BlackMagic's ATEM equipment did. People wanting to do traditional multicam shoots and all of the procedural nightmare that comes with that are free to continue doing it. Using something like this allows for so much more flexibility that it strains credulity that you're unable to see it.

  • amelius 3 months ago

    > Why do we need proprietary stuff like this when we already have a tried and true method for syncing multicamera shots?

    Because thanks to Apple, everybody and their mother can do it.

    The question is then, why is this on HN?

ukuina 3 months ago

I tried this last weekend and it was anything but smooth. Many discovery/connection issues, and only two of the four devices sent over anything except the highly compressed "preview" stream, and they wouldn't send the final copy no matter what I tried. Had to make do with the lower quality videos to get the edit out quickly.

Bonus: FCP on iPad would keep throwing a blocking error every sixty seconds or so stating it couldn't get the final streams, for the entire duration of my editing and long after the other devices had left the vicinity. It seems the FCP project is somehow tainted by this.

Next time, I'll just record separate videos from all of them and use Resolve to sync via audio analysis to create a new multicam video.

geraldwhen 3 months ago

Syncing audio and video timing for multicam is so difficult that it generally requires onsite highly paid camera operators to manage.

I wonder what magic they used (and what underlying data format for the video) to make this sync magic happen.

  • solardev 3 months ago

    Is that still the case these days? I filmed a multi cam performance a few weeks ago with an Android phone, a PC laptop, and an iPad. The software (Premiere and others) was able to automatically sync them up just by analyzing their waveforms and lining them up. (You just drag the clips into a timeline and then right-click to sync them... it takes a few seconds to process but it's all automatic)

    It was good enough that the audio lined up perfectly except in cases where a performer was standing next to one microphone and far from the other one, which created a noticeable echo delay in stereo. But I think that's just the speed of sound at work, not the software's fault.

    My understanding is that this is a normal feature in video apps these days, to the point that a lot of older standalone time sync apps are no longer for sale because it's all built-in now. I'm not a professional in this space though, just someone recording amateur videos for a music class.

  • gorkish 3 months ago

    It's not hard; it's just kept behind a very thin veil of 'pro'

    A timecode server and recorders that sync timecode are literally all you need. Everything has been standardized for multiple decades.

    Truthfully, even this is starting to become unnecessary; most NLEs can just figure it out from the regular metadata and autocorrelation to sync the clocks. Just dump it all in the timeline and everything lines up.

    • geerlingguy 3 months ago

      The clocks often drift if you have more than 5-10 minutes of continuous footage though, so TC is essential. Unfortunately you have to bump up to prosumer/pro tier cameras and recorders if you want TC inputs.

      • gorkish 3 months ago

        Genlock is absolutely not needed to overcome clock skew; sync to nearest frame at cuts and resample the audio to the frame timings. Plus most modern equipment has really great clocks and I would not be surprised if it drifts less than hypothesized.

        Let's imagine we are shooting 24fps on a couple of iPhones; what is the clock error required to drift out by 1 frame over 10 minutes? One frame at 24fps is about 41.7 ms, and 41.7 ms over 600 s works out to about 70ppm. The iPhone uses a MEMS oscillator with about a 5ppm error, so ... eh? It just doesn't seem likely to be a big issue for regular stuff. If you are shooting scientific data or something, maybe this matters more, but you'll have to account for it in that context no matter what.

  • nerdjon 3 months ago

    Is it really? I record multiple angles with 2 iPhones and a third track with a microphone and Final Cut Pro just syncs it all up via audio syncing (before this feature came out), and handles them not having the same start point just fine.

    I have never had an issue with things not syncing properly unless there was a recording issue with one of them.

    I imagine it could be a problem with video with no audio, but that seems like a trivial thing to fix with some cheap microphones attached to your camera.

    It seems like this uses the same after-the-fact audio sync to get everything lined up. One time I was testing, all I did was tap the microphone and didn't say anything; of course the phones couldn't sync with no other audio, and I got an alert that it was unable to sync the clips.

  • boffinAudio 3 months ago

    I wrote an app (MIKME) which does this - the app's purpose is to allow remote recording with a self-powered Bluetooth microphone, so you can have "point-source" audio for your video projects: leave the microphone with the subject, but move the camera around.

    It wasn't that difficult to get audio/video in sync .. I just used AVFoundation, which has plenty of functions for dealing with the sync issue. The biggest problem was getting the recordings off the microphone's filesystem to the iOS device - we had to invent a new protocol for this ("lost and found protocol"), because Apple sure don't make it easy for devices to share data .. but once the files are copied, the sync is the easy part.

  • zhengyi13 3 months ago

    Dumb/ignorant question:

    I'm under the impression that the "clack" used at the beginning of the stereotypical film take functions as a sound spike that multiple reels of film from different angles can be synchronized on. Assuming I'm not way off base here, what about syncing audio/video on modern digital devices is more difficult?

    • dylan604 3 months ago

      It's a different type of sync than that. Syncing multicameras would ensure the scanning of each camera is aligned so that the start of a frame is at the same time on each camera. Traditionally, you needed cameras that had the ability to accept a sync reference signal provided from a sync generator by running a dedicated cable to each camera as well as all of the other video equipment like switchers, Chyron, VTRs, etc. In the bad old days, if you tried to switch between two cameras that were not in sync, you'd get a very unclean cut as the H/V sync between sources were not aligned.

      Today, switching equipment just "re-stripes" the timing from each input, so a house reference is not necessary with modern prosumer gear. For broadcast still supporting interlaced HD, they still run that reference cable.

    • numpad0 3 months ago

      Clapperboards are for syncing video and audio for a single camera. The audio isn't always recorded by the video camera itself, so a means to synchronize the two is (or was) needed.

  • dylan604 3 months ago

    Camera tech and syncing have become almost trivial. You can do shoots with a mix of cameras all feeding back to a central place without running any kind of sync between them.

  • Ajedi32 3 months ago

    When you think about it, it's actually pretty ridiculous that this is difficult in the first place, in the internet era. In principle all you really need is a standard way to tag each keyframe with an NTP-synced timestamp, and any camera with an internet connection would be able to achieve this effortlessly.

    • fwip 3 months ago

      NTP time skew can be on the order of 100ms. If the audio/video is desynced by 30ms, it can definitely be noticeable. Some people are more sensitive than others, and it also depends on the particulars of the content.

      Edit: Another commenter points out that in some situations, it's even necessary to align the time that the frames start, which necessarily requires sub-frame precision.

      • Ajedi32 3 months ago

        It's almost never going to be that bad unless you're going out over a terrible internet connection. Worst case you could have a way to configure one camera to run its own NTP server and have the others sync to it over LAN. That would get you sub-frame (<1ms) precision easily.

    • ricktdotorg 3 months ago

      this drove me insane back in ~2004/2005 as a longtime-NTP-loving sysadmin who was doing tech for an indie film company. they were ingesting tons of multicam MiniDV into FCP and their timecode discipline was NOT good. there was no simple answer to this back then, pure manual work. truly is kind of bonkers it has taken until ~2024 to solve this (kinda).

    • solardev 3 months ago

      Is NTP precise enough for this? I thought timestamps are usually in milliseconds, but audio samples are spaced only fractions of a millisecond apart? If you're off by a little, won't it create weird echoes or interference in the sound?

      • dghlsakjg 3 months ago

        Sound travels at roughly 1 foot per millisecond (standard atmospheric conditions).

        So a sound feed with multiple microphones several feet apart already has this issue.

        • solardev 3 months ago

          But that's actually part of the signal (useful information), isn't it, like for stereo or spatial recording setups? Presumably each performer or instrument's delay is a function of its distance to different microphones.

          vs NTP-introduced errors, which are just noise and not part of the intentional recording?

    • adamomada 3 months ago

      Oh c’mon there are people who have thought about it for more than the ten seconds you have and it’s not feasible and not effortless - you want to give every cam an internet gateway??

    • milleramp 3 months ago

      Modern cameras are using PTP for synchronization.

  • btgeekboy 3 months ago

    I admit I have no idea how it's actually working, but Apple does have experience keeping audio in sync: AirPods. I suspect video's a similar process.

dharma1 3 months ago

I wish Resolve could record multiple video/audio input sources at once - like a video DAW.

4K HDMI -> USB capture devices are so cheap these days. Yes you can record on the cameras, transfer and sync but it’s friction
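
In the meantime, one workaround is to record the capture devices directly, outside Resolve. On Linux, something like this grabs two UVC capture devices to separate files in one go (an untested sketch; device paths, supported formats and USB bandwidth all vary):

    # capture two v4l2 devices at once, encoding each to its own file
    ffmpeg \
        -f v4l2 -framerate 30 -video_size 1920x1080 -i /dev/video0 \
        -f v4l2 -framerate 30 -video_size 1920x1080 -i /dev/video1 \
        -map 0:v -c:v libx264 -preset veryfast cam0.mkv \
        -map 1:v -c:v libx264 -preset veryfast cam1.mkv

You still sync in the NLE afterwards, so it only removes the transfer step, not all the friction.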

  • flutas 3 months ago

    > 4K HDMI -> USB capture devices are so cheap these days.

    FWIW: Most of these cheaper devices will actually be ~720p max upscaled.

    https://www.naut.ca/blog/2020/07/09/cheap-hdmi-capture-card-...

    • sho 3 months ago

      Yeah, the cheapest good-ish solution is the ~$100 elgato camlink 4K, which will give you 4K 30fps in. It works well enough. Skip the really cheap crap, as you say.

  • geerlingguy 3 months ago

    Right now the best solution is using hardware like the Atomos Sumo, if you want multiple sources and full quality. I wish there were an easy breakout for 2-4 HDMI or SDI inputs straight through to a multi cam timeline on my Mac.

  • sho 3 months ago

    Yeah, I got back into video recently and it's pretty surprising that the "multiple video in" isn't a solved problem in 2024! Even recording "clean" video by itself is unexpectedly tricky. The only choice, on a mac at least, appears to be to record directly from the USB device using quicktime.

    Seems like a definite opportunity for someone, although it also looks like Apple are thinking of moving into the space, so maybe not.

    • Mashimo 3 months ago

      > Even recording "clean" video by itself is unexpectedly tricky.

      VLC, OBS, ffmpeg (and webcam app on windows) should all be able to do it.
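
      For example, on a Mac, ffmpeg can record straight from a UVC capture device via AVFoundation (a rough, untested sketch; device indices differ per machine):

          # list available video/audio devices first
          ffmpeg -f avfoundation -list_devices true -i ""
          # record video device 0 and audio device 0 to a file
          ffmpeg -f avfoundation -framerate 30 -video_size 1920x1080 -i "0:0" \
              -c:v h264_videotoolbox -b:v 20M -c:a aac -b:a 256k capture.mov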

      • sho 3 months ago

        I ran into so many obscure, incomprehensible issues with OBS I just gave up. I'm sure the other two can also be coerced into doing what you need but Quicktime just works. Maybe things are different on PC.

suyash 3 months ago

Anyone know of a similar setup and app for Android devices?

dchuk 3 months ago

Anyone know if you can combine this multi cam feature with an external camera source connected to the iPad? So let’s say you have a canon camera hooked up by usbc to Final Cut and that’s your iPad’s camera, and then you have an iPhone as your second camera? If so, this can really be quite the portable high end recording solution for podcasters and YouTubers…

lsy 3 months ago

Is there a streaming-native way of encoding multiple camera angles simultaneously in a video? The ability to toggle between different angles is a feature on DVDs and Blu-rays that IMO is underused but awesome for educational videos where different aspects or angles of a demonstration might be useful.

  • jimbobthrowawy 3 months ago

    YouTube has multi-stream capability, and I saw one streamer use it years ago to let you pick which multiplayer user to spectate. I assume they used separate streams inside the .m3u8 file used for HLS, much like how you'd do different resolutions. Haven't seen it done in years.

    .mkv files natively support adding an arbitrary number of video tracks, though player support is kind of rare. (it's _ by default to cycle video tracks in mpv)
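
    If you already have synced angles, muxing them into one multi-track file is a one-liner (a sketch; it assumes the clips share a common start point and just copies the streams):

        # mux three camera angles plus one audio track into a single .mkv
        ffmpeg -i cam1.mp4 -i cam2.mp4 -i cam3.mp4 \
            -map 0:v -map 1:v -map 2:v -map 0:a -c copy multiangle.mkv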

    • rzzzt 3 months ago

      MP4 containers also support multiple video tracks.

  • Goofy_Coyote 3 months ago

    On Mac, I'm using Continuity Camera with an older extra iPhone I have + Elgato Cam Link to hook in a DSLR. Then I use OBS Studio to mash them all together, you can then use some plugins to switch between cameras live while streaming (or recording).

    Hope it helps

0x70dd 3 months ago

iPhones record videos with a variable frame rate, which makes them unsuitable for syncing the video with a separately recorded audio track (e.g. when recording guitars). The video and audio would ever so slightly go out of sync. I wish there were a way to use a fixed frame rate when recording.

  • GeekyBear 3 months ago

    As covered in the article, the free Final Cut Camera app gives you full control of the camera settings, including fixed frame rates.
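
    For footage already shot with a variable frame rate, you can also conform it to a constant frame rate before editing, e.g. with ffmpeg (a sketch; this re-encodes the video, duplicating or dropping frames to hit a fixed rate, and copies the audio untouched):

        # conform a VFR clip to 30 fps constant frame rate
        ffmpeg -i clip.mov -vsync cfr -r 30 -c:v libx264 -crf 18 -c:a copy clip_cfr.mov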

edandersen 3 months ago

Why can't this work with Final Cut Pro on a Mac?

kragen 3 months ago

possibly if you are interested in doing this sort of thing, you will be interested in doing it with free software. in that case:

the debian debconf video team has been doing this for something like 15 years without the iphones and ipads, and without a limit of 4 camera angles, with 100% free software, and supporting live streaming (with simultaneous recoding to various bit rates) as well as just recording. they can fix unfortunate choices of camera stream after the fact for the final archived videos, too

they document their setup at https://debconf-video-team.pages.debian.net/docs/index.html. the live video mixing software (to choose which camera angle to stream at any given time) is voctomix https://github.com/voc/voctomix which belongs to the ccc video operation center https://c3voc.de/

i don't actually know what they use to push the rtmp video streams to the streaming server, but rtmp is very widely supported, since it's what twitch and youtube use for streaming video. obs studio can do this and is probably the most popular software for doing it; it's also 100% free software. https://www.digitalocean.com/community/tutorials/how-to-set-... says that you can also use ffmpeg directly

    ffmpeg -re -i "Introducing App Platform by DigitalOcean-iom_nhYQIYk.mp4" -c:v copy -c:a aac -ar 44100 -ac 1 -f flv rtmp://localhost/live/stream

https://trac.ffmpeg.org/wiki/Capture/Webcam#Encodingexample1 suggests getting ffmpeg input from your webcam with

    ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0

and i can successfully test my webcam with just

    ffmpeg -i /dev/video0 -pix_fmt yuv420p -f sdl 'hi'

but i think that integrating an audio stream is a little more hassle if you want to try doing it manually with ffmpeg command lines (details in https://trac.ffmpeg.org/wiki/StreamingGuide and https://www.baeldung.com/linux/ffmpeg-webcam-stream-video). if you want a gui just use obs studio
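
a rough sketch of what the manual version might look like (untested, and alsa device names vary), combining the v4l2 video input above with an alsa audio input and pushing both to the same rtmp endpoint:

    # v4l2 video + alsa audio, encoded and pushed to an rtmp server
    ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 \
        -f alsa -i default \
        -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 50 \
        -c:a aac -ar 44100 -b:a 128k \
        -f flv rtmp://localhost/live/stream
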
  • snailmailman 3 months ago

    In a similar vein, I've used VDO.ninja several times to easily stream an iOS device camera into OBS. It just throws the camera stream onto a webpage, and then I load the page in OBS. Not zero latency, but it's "good enough" for me to use my phone as a simple extra webcam. Works on basically anything that has a web browser + camera.

    • kragen 3 months ago

      ooh, thanks! is vdo.ninja proprietary? if so, maybe something like mediasoup would enable a self-hosted version?

      i like the idea of joining half a dozen cellphones to a jitsi chat or similar, then recording all of the streams and allowing on-the-fly stream selection to change camera angles for the live stream

      • snailmailman 3 months ago

        It’s self-hostable, although I’m not seeing the link to their GitHub on the main vdo.ninja site. It’s linked from their docs page though.

        https://github.com/steveseguin/vdo.ninja

        I think the streaming happens P2P either way. So if you don’t self host it I don’t think you incur a latency hit. It should route the video straight from phone->PC. It doesn’t route through their server as far as I know. (Edit: I stand corrected. Their docs mention an encrypted TURN server may get used if a direct connection isn’t possible)

        I do use the self hosted version anyway. Was pretty easy to setup.

        • kragen 3 months ago

          agpl, sweet!

twobitshifter 3 months ago

do you think final cut camera will support pro photography or will there be a pro camera app?