I have created an OSMF Pseudostreaming Plugin: https://github.com/mexxik/osmf-pseudostreaming
Maybe somebody can find it useful.
Encrypted HLS Stream, HTTPNetStream, HTTPStreamSource
Hi, I'm trying to find anyone who might have experience working with an encrypted HLS stream in OSMF.
I've been using the library at http://code.google.com/p/apple-http-osmf/ and managed to get this working with a non-encrypted stream.
If anyone has in-depth knowledge of the HTTPNetStream and HTTPStreamSource classes, it would also be helpful to get in touch with you.
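For context on what the player side has to handle: per the HLS spec, an encrypted stream is signalled in the playlist by an #EXT-X-KEY tag carrying the cipher method, key URI, and optionally an IV. Here's a rough sketch in Python of reading that tag, just to illustrate the playlist format (the function name and example playlist are mine, not OSMF code):

```python
import re

def parse_ext_x_key(playlist_text):
    """Extract encryption parameters from an HLS playlist's #EXT-X-KEY tag.

    Per the HLS spec, METHOD=AES-128 means each media segment is encrypted
    with AES-128 in CBC mode; URI points at the 16-byte key, and IV (if
    present) is a hex-encoded initialization vector.
    """
    for line in playlist_text.splitlines():
        if not line.startswith("#EXT-X-KEY:"):
            continue
        # Attributes are KEY=value pairs, with quoted strings for URIs.
        attrs = dict(re.findall(r'([A-Z0-9-]+)=("[^"]*"|[^,]*)',
                                line[len("#EXT-X-KEY:"):]))
        return {k: v.strip('"') for k, v in attrs.items()}
    return None  # no EXT-X-KEY tag: unencrypted playlist

playlist = """#EXTM3U
#EXT-X-KEY:METHOD=AES-128,URI="https://example.com/key.bin",IV=0x00000000000000000000000000000001
#EXTINF:10,
segment0.ts
"""
key_info = parse_ext_x_key(playlist)
```

With METHOD=AES-128, a player fetches the 16-byte key from the URI and decrypts each segment with AES-128-CBC; when no IV attribute is given, the segment's media sequence number serves as the IV. That per-segment decryption step is what an OSMF HTTPStreamSource would have to slot in before handing data to the decoder.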
Hi OSMF Experts,
I've tried to duplicate segments of a video with BitmapData.draw.
I used OSMF 2.0 for the tests, and I tested with the sample.mp4/manifest.f4m example (see links).
Thanks for your answers.
Help Needed Posting a Position: Looking for an OSMF Developer
I would like to post a position but can't find the posting link. I am looking for an OSMF developer for a contract position in San Jose. Can anyone tell me where to post this? Thanks!
I've started to add HTTP Dynamic Streaming support to FFmpeg. In the process I found something very odd and would appreciate any feedback or insight. I started stripping atoms out of the fragments generated by the Adobe fragmenting tool, in order to find the minimal set of atoms OSMF needs to play a fragmented stream. I followed Adobe's F4V document and read what was mandatory and what wasn't, but I wanted to test that against the OSMF implementation. What I discovered is that I could strip every atom except the MDAT atom, and OSMF was still able to play the fragments. I'm confused, because I thought the TRAF atoms held the location and timing of each frame within the fragment file. So how is OSMF able to play the file? My only thought is that the data in the MDAT isn't just encoded video/audio, but contains some additional container data. Any ideas?
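To double-check which atoms actually remain after stripping, it helps to walk the fragment's top-level box structure: each ISO-base-media/F4F box starts with a 32-bit big-endian size and a four-character type. A minimal sketch in Python (the toy two-box fragment at the end is fabricated purely for illustration):

```python
import struct

def list_boxes(data):
    """List (type, size) for the top-level boxes of an ISO base media /
    F4F fragment. A size of 1 means a 64-bit size follows the type
    (common for large mdat boxes); a size of 0 means the box runs to EOF.
    """
    boxes, offset = [], 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        header = 8
        if size == 1:  # 64-bit extended size follows the type field
            size, = struct.unpack_from(">Q", data, offset + 8)
            header = 16
        elif size == 0:  # box extends to the end of the data
            size = len(data) - offset
        if size < header:  # malformed box; stop rather than loop forever
            break
        boxes.append((box_type, size))
        offset += size
    return boxes

# Toy fragment: an 8-byte 'free' box followed by an 'mdat' box with 4 payload bytes.
frag = struct.pack(">I4s", 8, b"free") + struct.pack(">I4s", 12, b"mdat") + b"\x00" * 4
```

Running `list_boxes` over a stripped fragment confirms exactly which boxes survived, which makes the "only mdat left, yet it still plays" observation easy to verify before digging into what OSMF does with the mdat payload.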
How to directly use org.osmf.net.httpstreaming.HTTPNetStream
I want to develop an HTTP live stream player, but I don't want to use the MediaPlayer or MediaPlayerSprite class from the OSMF framework; I want to start from NetConnection and NetStream. Can I use the HTTPNetStream class directly? If so, how? Thanks!
Hi, I am an industrial designer by trade, and I am adding interaction design to my skills.
I am working on an assignment: an AIR app that can display a 3D model and convert it to a Collada model.
I am a programming newbie, so I am not able to handle the errors that come up,
and I was hoping I could get help from some seasoned users.
I created an app in AIR Launchpad, but as I add things to the code I get errors.
I am using the code in this project to view the model and manipulate it.
But I can't even run that because of this error:
/Applications/Adobe Flash Builder 4.5/sdks/4.5.1/runtimes/air/mac" "/Users/remel/Documents/
Adobe Flash Builder 4.5/ME580x/FinalProject/ColladaAirViewer/bin-debug/ColladaAirViewer-app.xml" "/
Users/remel/Documents/Adobe Flash Builder 4.5/ME580x/FinalProject/ColladaAirViewer/bin-debug"
Thanks for your help.
Hello, I have a little problem. I want to know when a stream has been 'reconnected' (using OSMF 1.6).
user 1 - publishes video from camera
user 2 - watches the video
If user 1 stops recording or disconnects from the stream, user 2 will always receive the playbackError state. This is OK (I know the stream isn't working).
But when user 1 starts recording again or reconnects to the stream, I get the ready state only sometimes.
I am not sure whether this is an OSMF bug or a configuration problem.
I need a way to tell that the stream is up and running again.
Does anyone know whether the standard specifications for F4M or M3U support start and stop times? I want to be able to tell Strobe where to start and stop a video. I understand that you can do this with flashvars, and yes, I could read that information and inject it into the page at run time. But ideally I would like to include this information in either the F4M or M3U formats that Strobe supports.
Thank you for the help.
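Absent direct support in the specs, one workaround on the M3U side is to map start/stop times onto the per-segment #EXTINF durations the playlist already carries, and trim the playlist down to the covering segment range. A rough sketch of that mapping (the function name and numbers are mine, purely for illustration):

```python
def segments_for_window(durations, start, stop):
    """Given per-segment durations in seconds (from #EXTINF tags), return
    the (first, last) segment indices whose combined span covers
    the time window [start, stop)."""
    first = last = None
    t = 0.0
    for i, d in enumerate(durations):
        seg_start, seg_end = t, t + d
        if first is None and seg_end > start:
            first = i  # first segment overlapping the window
        if seg_start < stop:
            last = i   # keep advancing while segments start before stop
        t = seg_end
    return first, last

# Four 10-second segments; a clip from 12s to 35s needs segments 1 through 3.
window = segments_for_window([10, 10, 10, 10], 12, 35)
```

A server-side playlist rewriter (or a custom loader in front of the player) could use this to emit only the segments inside the clip window, which approximates start/stop without extending the M3U format itself.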