Add guidance for defining a new source of MediaStreamTrack #988

Merged · 9 commits · Feb 22, 2024
Changes from 3 commits
79 changes: 57 additions & 22 deletions getusermedia.html
@@ -847,13 +847,17 @@ <h4>Life-cycle</h4>
<p>If all {{MediaStreamTrack}}s that are using the same
source are [= track/ended =], the source will be
[= source/stopped =].</p>
<p>When a {{MediaStreamTrack}} object ends for any
reason (e.g., because the user rescinds the permission for the page to
use the local camera, or because the application invoked the
{{MediaStreamTrack/stop()}} method on
the {{MediaStreamTrack}} object, or because the User
Agent has instructed the track to end for any reason) it is said to be
<p>When the application has invoked the {{MediaStreamTrack/stop()}}
method on a {{MediaStreamTrack}} object, or when the [=source=] of a
{{MediaStreamTrack}} ends production of live samples to its tracks,
Contributor:
for clarity, I think this should be "permanently ends production". If production can resume, the source is muted, not ended.

alvestrand marked this conversation as resolved.
whichever is sooner, a {{MediaStreamTrack}} is said to be
<dfn id="track-ended" data-dfn-for="track" data-export>ended</dfn>.</p>
<p>For camera and microphone sources, the reasons for a source to
[=ended|end=] its tracks, other than
{{MediaStreamTrack/stop()}}, are [=implementation-defined=]
(e.g., because the user rescinds the permission for the page to
use the local camera, or because the User
Agent has instructed the track to end for any reason).</p>
<p>When a {{MediaStreamTrack}} <var>track</var>
<dfn data-lt="track ended by the User agent" data-for=MediaStreamTrack data-export id="ends-nostop">ends for any reason other than the {{MediaStreamTrack/stop()}} method being
invoked</dfn>, the [=User Agent=] MUST queue a task that runs the following
@@ -899,12 +903,15 @@ <h4>Media Flow</h4>
muted, and enabled / disabled.</p>
<p><dfn data-export id=
"track-muted">Muted</dfn> refers to the input to the
{{MediaStreamTrack}}. If live samples are not made
available to the {{MediaStreamTrack}} it is muted.</p>
{{MediaStreamTrack}}. Live samples MUST NOT be made available to a
{{MediaStreamTrack}} while it is [=muted=].</p>
Contributor:
made available to track sinks?

Member Author:
I'd like to minimize language changes in this PR, and this seems orthogonal.

Samples are observable through track.stats as well, so I think it's incorrect to say it only affects sinks. I think the existing language covers the track AND sinks, but if you disagree, please open a separate issue to clarify.

<p>{{Muted}} is outside the control of web applications, but can be observed by
the application by reading the {{MediaStreamTrack/muted}} attribute and listening
to the associated events {{mute}} and {{unmute}}. There can be
several reasons for a {{MediaStreamTrack}} to be muted:
to the associated events {{mute}} and {{unmute}}. The reasons for a
{{MediaStreamTrack}} to be muted are defined by its <a>source</a>.</p>
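A short sketch of what "outside the control of web applications" means in practice: the page can only read the muted attribute and react to the mute/unmute events, it cannot set them (a microphone track is assumed for the example):

```ts
// Sketch: muted can be observed but never set by the page.
async function watchMuted(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const [track] = stream.getAudioTracks();

  console.log("initially muted?", track.muted); // read-only attribute

  track.addEventListener("mute", () => {
    // The source stopped delivering live samples (e.g. a hardware mute switch).
    console.log("muted by the source / User Agent");
  });
  track.addEventListener("unmute", () => {
    console.log("live samples are flowing again");
  });
}
```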
<p>For camera and microphone sources, the reasons to [=muted|mute=] are
[=implementation-defined=]. This allows user agents to implement privacy
Contributor:
I think this is a reasonable way to phrase it.

mitigations in situations like:
the user pushing a physical mute button on the microphone, the user
closing a laptop lid with an embedded camera, the user toggling a
control in the operating system, the user clicking a mute button in the
@@ -915,9 +922,13 @@ <h4>Media Flow</h4>
this information to the web application through {{MediaStreamTrack/muted}} and
its associated events.</p>

<p>Whenever the [=User Agent=] initiates such a change, it MUST queue a
<p>Whenever the [=User Agent=] initiates such an [=implementation-defined=]
change for camera or microphone sources, it MUST queue a
task, using the user interaction task source, to [=set a track's muted
state=] to the state desired by the user.</p>
<div class="note">This does not apply to [=source|sources=] defined in
other specifications. Other specifications need to define their own steps
to [=set a track's muted state=] if desired.</div>
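To illustrate the note, a rough TypeScript-flavoured model of what an extension specification is expected to do: invoke the set-a-track's-muted-state algorithm itself whenever its source's own muting conditions change. The names and the modelled steps below are illustrative only (the algorithm's actual steps are collapsed in this diff):

```ts
// Illustrative model only — TrackModel, setTrackMutedState and
// HypotheticalSource are invented names, not part of any specification.
interface TrackModel {
  muted: boolean;
  fire(eventName: "mute" | "unmute"): void;
}

// Approximates the intent of "set a track's muted state": act only on a
// change, then surface it via the muted attribute and mute/unmute events.
function setTrackMutedState(track: TrackModel, newState: boolean): void {
  if (track.muted === newState) return;
  track.muted = newState;
  track.fire(newState ? "mute" : "unmute");
}

// An extension-defined source decides when to call it, as the note requires.
class HypotheticalSource {
  constructor(private tracks: TrackModel[]) {}
  onProductionPaused(): void {
    this.tracks.forEach(t => setTrackMutedState(t, true));
  }
  onProductionResumed(): void {
    this.tracks.forEach(t => setTrackMutedState(t, false));
  }
}
```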
<p>To <dfn class="abstract-op" id="set-track-muted">set a track's muted state</dfn> to
<var>newState</var>, the [=User Agent=] MUST run the following steps:</p>
<ol class="algorithm">
@@ -5671,8 +5682,7 @@ <h2>Extensibility</h2>
specification may be extended. Two likely extension points are defining a
new media type and defining a new constrainable property.</p>
<section>
<h2>Defining a new media type (beyond the existing Audio and Video
types)</h2>
<h2>Defining a new {{MediaStreamTrack/kind}} of media (beyond audio and video)</h2>
<p>At a minimum, defining a new media type would require</p>
<ul>
<li>adding a new getXXXXTracks() method for the type to the
@@ -5685,7 +5695,7 @@ <h2>Defining a new media type (beyond the existing Audio and Video
the {{MediaStreamTrack}} interface,</li>
<li>defining any constrainable properties (see <a href=
"#constrainable-properties"></a>) that are applicable to the media
type,
type for each <a>source</a>,
</li>
<li>updating how the {{HTMLMediaElement}} works with a
{{MediaStream}} containing a track of the new media type
@@ -5750,21 +5760,46 @@ <h2>Defining a new constrainable property</h2>
Future versions of this specification and others created by the WebRTC Working Group will take into consideration all extensions they are
aware of in an attempt to reduce potential usage conflicts.</p>
</section>
<p>&nbsp;</p>
<p>It is also likely that new consumers of {{MediaStream}}s
or {{MediaStreamTrack}}s will be defined in the future. The
following section provides guidance.</p>
<section>
<h2>Defining new consumers of {{MediaStream}}s and {{MediaStreamTrack}}s</h2>
<p>At a minimum, any new consumer of a
{{MediaStreamTrack}} will need to define</p>
<h2>Defining a new sink for {{MediaStreamTrack}} and {{MediaStream}}</h2>
<p>It is likely that new sinks for {{MediaStream}} and/or {{MediaStreamTrack}}
will be defined in the future. At a minimum, a new consumer of a
jan-ivar marked this conversation as resolved.
{{MediaStreamTrack}} will need to define:</p>
<ul>
<li>how a {{MediaStreamTrack}} will render in the
<li>how a {{MediaStreamTrack}} will be consumed in the
various states in which it can be, including muted and disabled (see
[[[#life-cycle-and-media-flow]]]).
</li>
</ul>
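As a hedged illustration of the bullet above: a hypothetical sink (the class name is invented for this sketch) has to define its behaviour for ended, muted and disabled tracks rather than assume live samples are always present:

```ts
// Hypothetical sink — "FrameLogger" is invented; real sinks such as
// HTMLMediaElement or RTCPeerConnection define this in their own specs.
class FrameLogger {
  constructor(private track: MediaStreamTrack) {}

  render(): void {
    if (this.track.readyState === "ended") {
      console.log("track ended: release resources, render nothing");
      return;
    }
    if (this.track.muted || !this.track.enabled) {
      // No live samples (muted) or the application opted out (disabled):
      // a sink must say what it renders instead, e.g. black frames or silence.
      console.log("render placeholder output");
      return;
    }
    console.log("render live samples");
  }
}
```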
</section>
<section>
<h2>Defining a new <a>source</a> of {{MediaStreamTrack}}</h2>
<p>It is likely that new sources of {{MediaStreamTrack}}
will be defined in the future. At a minimum, a new source of
jan-ivar marked this conversation as resolved.
{{MediaStreamTrack}} will need to</p>
<ul>
<li>define a new API to [=create a MediaStreamTrack=] of the
relevant {{MediaStreamTrack/kind}}s from this new <a>source</a>
({{MediaDevices/getUserMedia()}} is dedicated to camera and microphone sources),
</li>
<li>declare which constrainable properties (see <a href=
"#constrainable-properties"></a>), if any, are applicable to each
{{MediaStreamTrack/kind}} of media this new <a>source</a> produces,
and how they work with this source,
</li>
<li>describe how and when to [=set a track's muted state=] for this
<a>source</a>,
Contributor:
Is it clear enough that, if extension spec is not defining how/when muted changes, then muted will never change for tracks of the given source?

Member Author:
Also need to define ended event and what the initial source muted state should be.

Member Author (@jan-ivar, Feb 1, 2024):
Also file issues on other specs to define mute, initial mute and ended.

</li>
jan-ivar marked this conversation as resolved.
<li>if capture of the source is a [=powerful feature=] requiring
[=express permission=], describe its
<a href="#permissions-integration">permissions integration</a> and
<a href="#permissions-policy-integration">permissions policy integration</a>,
</li>
<li>if capture of the source poses a privacy concern, describe its
<a href="#privacy-indicator-requirements">privacy indicator requirements</a>.
</li>
</ul>
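Putting the bullets above together, a hypothetical extension-spec API surface might look like the following; every name here (ThermalCameraDevices, getThermalTracks()) is invented purely to illustrate the checklist:

```ts
// Invented extension surface, for illustration of the checklist only.
interface ThermalTrackConstraints {
  frameRate?: ConstrainDouble; // a constrainable property the spec declares
}

interface ThermalCameraDevices {
  // 1. A dedicated API to create a MediaStreamTrack of the relevant kind
  //    (getUserMedia() remains reserved for cameras and microphones).
  getThermalTracks(constraints?: ThermalTrackConstraints): Promise<MediaStreamTrack[]>;
}

async function useThermalCamera(devices: ThermalCameraDevices): Promise<void> {
  // 2. Only the declared constrainable properties apply to this kind.
  const [track] = await devices.getThermalTracks({ frameRate: { ideal: 9 } });

  // 3. The extension spec defines when this source mutes, unmutes and ends.
  track.addEventListener("mute", () => console.log("thermal source paused"));
  track.addEventListener("ended", () => console.log("thermal source gone"));

  // 4. Permission / permissions-policy gating and privacy indicators would be
  //    described in the extension specification itself (not modelled here).
}
```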
</section>
</section>
<section class="appendix">
<h2>Acknowledgements</h2>