Enhancing User Experience with Responsive Gestures

In this section, we explore how Jetpack Compose can be used to create animations driven by user gestures. We’ll focus on two examples: a multi-touch transformable image and a gesture-controlled audio waveform.

A) Multi-Touch Transformable Image

In this example, we’ll create an image that users can manipulate with multi-touch gestures: pinch to zoom, twist to rotate, and drag to pan.

Multi-touch Transformable Image
@Composable
fun TransformableImage(imageId: Int = R.drawable.android) {
    // Gesture-driven transform state: scale factor, rotation in degrees, and pan offset.
    var scale by remember { mutableStateOf(1f) }
    var rotation by remember { mutableStateOf(0f) }
    var offset by remember { mutableStateOf(Offset.Zero) }

    Box(
        modifier = Modifier
            .fillMaxSize()
            .background(Color.DarkGray),
        contentAlignment = Alignment.Center
    ) {
        Image(
            painter = painterResource(id = imageId),
            contentDescription = "Transformable image",
            contentScale = ContentScale.Crop,
            modifier = Modifier
                .size(300.dp)
                // Apply the accumulated transforms to the image.
                .graphicsLayer(
                    scaleX = scale,
                    scaleY = scale,
                    rotationZ = rotation,
                    translationX = offset.x,
                    translationY = offset.y
                )
                // Pinch to zoom, twist to rotate, and drag to pan.
                .pointerInput(Unit) {
                    detectTransformGestures { _, pan, zoom, rotate ->
                        scale *= zoom
                        rotation += rotate
                        offset += pan
                    }
                }
        )
    }
}

Explanation

  • The Image composable uses graphicsLayer to apply the scale, rotation, and translation transformations.
  • Modifier.pointerInput with detectTransformGestures handles the multi-touch gestures, updating scale, rotation, and offset as the fingers move (a higher-level alternative using Modifier.transformable is sketched right after this list).
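
If you prefer not to manage the gesture detection yourself, Compose’s foundation library also offers Modifier.transformable with rememberTransformableState, which reports the zoom, pan, and rotation deltas of a gesture in a single callback. Below is a minimal alternative sketch of the same image; the composable name and the scale bounds are illustrative choices, not part of the original example.

@Composable
fun TransformableImageAlt(imageId: Int = R.drawable.android) {
    var scale by remember { mutableStateOf(1f) }
    var rotation by remember { mutableStateOf(0f) }
    var offset by remember { mutableStateOf(Offset.Zero) }

    // One callback delivers the zoom, pan, and rotation deltas of the current gesture.
    val transformState = rememberTransformableState { zoomChange, panChange, rotationChange ->
        scale = (scale * zoomChange).coerceIn(0.5f, 5f) // arbitrary zoom bounds for illustration
        rotation += rotationChange
        offset += panChange
    }

    Image(
        painter = painterResource(id = imageId),
        contentDescription = "Transformable image",
        modifier = Modifier
            .size(300.dp)
            .graphicsLayer(
                scaleX = scale,
                scaleY = scale,
                rotationZ = rotation,
                translationX = offset.x,
                translationY = offset.y
            )
            .transformable(state = transformState)
    )
}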

B) Gesture-Controlled Waveform

Here’s a waveform visualization whose appearance is controlled by user gestures: dragging across the screen adjusts the wave’s amplitude and frequency.

Gesture Controlled Waveform
@Composable
fun GestureControlledWaveform() {
    // Gesture-driven state: vertical drags change the amplitude, horizontal drags the frequency.
    var amplitude by remember { mutableStateOf(100f) }
    var frequency by remember { mutableStateOf(1f) }

    Canvas(
        modifier = Modifier
            .fillMaxSize()
            .pointerInput(Unit) {
                detectDragGestures { _, dragAmount ->
                    amplitude += dragAmount.y
                    // Clamp the frequency so the wavelength never becomes zero or negative.
                    frequency = (frequency + dragAmount.x / 500f).coerceAtLeast(0.1f)
                }
            }
            .background(
                Brush.verticalGradient(
                    colors = listOf(Color(0xFF003366), Color.White, Color(0xFF66B2FF))
                )
            )
    ) {
        val width = size.width
        val height = size.height
        val halfHeight = height / 2
        val waveLength = width / frequency
        val path = Path()

        path.moveTo(0f, halfHeight)

        // Sample the sine function once per pixel column to build the wave path.
        for (x in 0 until width.toInt()) {
            val theta = 2.0 * Math.PI * x / waveLength
            val y = halfHeight + amplitude * sin(theta).toFloat()
            path.lineTo(x.toFloat(), y)
        }

        val gradient = Brush.horizontalGradient(
            colors = listOf(Color.Blue, Color.Cyan, Color.Magenta)
        )

        drawPath(
            path = path,
            brush = gradient
        )
    }
}

Explanation

  • amplitude and frequency are state variables that control the height and the density of the wave, respectively.
  • The Canvas composable draws the waveform. The drawing logic calculates a Y position for each X position using the sine function, producing the wave shape (the standalone sketch after this list walks through the same math).
  • detectDragGestures, inside pointerInput, updates amplitude and frequency from drag gestures: vertical drags adjust the amplitude and horizontal drags adjust the frequency.
  • As the user drags across the screen, the shape of the waveform changes accordingly, creating an interactive experience.
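
To look at the drawing math in isolation, the standalone sketch below (plain Kotlin, separate from the composable above; the helper name buildWavePoints and the pixel dimensions are illustrative) computes the same samples the Canvas loop produces.

import kotlin.math.sin

// Standalone helper mirroring the Canvas loop above: one (x, y) sample per pixel column.
fun buildWavePoints(width: Int, height: Int, amplitude: Float, frequency: Float): List<Pair<Float, Float>> {
    val halfHeight = height / 2f
    val waveLength = width / frequency
    return (0 until width).map { x ->
        val theta = 2.0 * Math.PI * x / waveLength
        x.toFloat() to (halfHeight + amplitude * sin(theta).toFloat())
    }
}

fun main() {
    // 1080 px wide, 400 px tall canvas, amplitude 100, frequency 1 (one full cycle across the width).
    val points = buildWavePoints(width = 1080, height = 400, amplitude = 100f, frequency = 1f)
    println(points.first())   // (0.0, 200.0): the wave starts on the vertical midline
    println(points[270])      // (270.0, 300.0): a quarter cycle in, sin(theta) = 1, so y = 200 + 100
}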

Note

  • This is a basic implementation. For a more realistic audio waveform, you would need to integrate actual audio data (a sketch of that direction follows this list).
  • The responsiveness of the waveform to gestures can be fine-tuned by adjusting how amplitude and frequency are modified during the drag.
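
The first note above mentions real audio data. A minimal sketch of that direction is shown below: it assumes you already have normalized amplitude samples in a FloatArray (for example captured via Android’s Visualizer API or decoded from PCM audio); the composable name AudioWaveform and the samples parameter are illustrative, not part of the original example.

@Composable
fun AudioWaveform(samples: FloatArray) { // samples assumed normalized to -1f..1f
    Canvas(modifier = Modifier.fillMaxSize().background(Color.Black)) {
        if (samples.isEmpty()) return@Canvas
        val halfHeight = size.height / 2f
        val stepX = size.width / (samples.size - 1).coerceAtLeast(1)
        val path = Path().apply {
            moveTo(0f, halfHeight - samples[0] * halfHeight)
            samples.forEachIndexed { i, sample ->
                // Spread the samples evenly across the width and scale them around the midline.
                lineTo(i * stepX, halfHeight - sample * halfHeight)
            }
        }
        drawPath(path = path, color = Color.Cyan, style = Stroke(width = 3f))
    }
}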

This example demonstrates how to create a basic interactive waveform in Compose, and it can be extended or modified for more complex use cases or to handle more intricate gestures.
